In my app I have a main context A and a sub-context B that has A as its parent. In context B I insert many objects that are rarely needed, so I would like to just write them to disk without keeping them in memory. Currently I call contextB.reset(), which frees some memory, but those objects still exist in context A. How can I wipe them out of context A without permanently deleting them and without affecting all the other entities that I fetch using context A? I cannot call contextA.reset() because then all the entity references that I still need become invalid, and I couldn't find a way of making a context selectively forget about specific entities.
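For illustration, here is a minimal sketch of the setup described above; the names, the concurrency types and the store setup are placeholders rather than code from my project:

import CoreData

// Context A is the main context backed by the persistent store (store setup omitted).
let contextA = NSManagedObjectContext(concurrencyType: .mainQueueConcurrencyType)
// Context B is a child of context A and receives the many rarely needed objects.
let contextB = NSManagedObjectContext(concurrencyType: .privateQueueConcurrencyType)
contextB.parent = contextA

contextB.performAndWait {
    // ... insert many rarely needed objects into contextB ...
    try? contextB.save()   // pushes the inserted objects up into contextA
    contextB.reset()       // frees contextB's copies, but contextA still holds the objects
}
// Only saving contextA actually writes the objects to disk.
try? contextA.save()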
Since March 2020, Feedback Assistant has been showing the messages inside feedback reports in the wrong order. For feedbacks where I respond shortly after a message from Apple, my response is shown before Apple's message, which makes it quite difficult to read those old reports. I haven't heard anything from Apple since, and the issue still happens. I keep hearing from Apple support that Apple really cares about feedback, so why does it take so long to fix an issue that makes it so difficult to handle reports and provide additional feedback?
I already filed a bug report for this issue in January 2020, but have received no response so far. It keeps happening that while I'm typing a response to an issue, the text gets reset for no reason and I have to type it all over again. Why does nobody care about fixing such annoying issues?
I'm trying to record video and audio and send them over the network so that they can be played back in real time on other clients. I've managed to record and play back video successfully, but audio still cannot be played back (see the AVAudioPlayer error at the bottom of the code below). What am I doing wrong, or what is missing? Thanks in advance for any input.
let captureSession = AVCaptureSession()

private func startVideoAudioFeed() {
    let sessionPreset = AVCaptureSession.Preset.low
    if captureSession.canSetSessionPreset(sessionPreset) {
        captureSession.sessionPreset = sessionPreset
    }

    // Video: request permission if needed, then add the camera input and a video data output.
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .notDetermined:
        AVCaptureDevice.requestAccess(for: .video) { success in
            self.startVideoAudioFeed()
        }
    case .authorized:
        captureSession.beginConfiguration()
        let captureVideoDevice = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front)!
        let captureVideoInput = try! AVCaptureDeviceInput(device: captureVideoDevice)
        if captureSession.canAddInput(captureVideoInput) {
            captureSession.addInput(captureVideoInput)
        }
        let captureVideoOutput = AVCaptureVideoDataOutput()
        captureVideoOutput.setSampleBufferDelegate(self, queue: DispatchQueue.main)
        if captureSession.canAddOutput(captureVideoOutput) {
            captureSession.addOutput(captureVideoOutput)
        }
        captureSession.commitConfiguration()
        captureSession.startRunning()
    default:
        break
    }

    // Audio: request permission if needed, then add the microphone input and an audio data output.
    switch AVCaptureDevice.authorizationStatus(for: .audio) {
    case .notDetermined:
        AVCaptureDevice.requestAccess(for: .audio) { success in
            self.startVideoAudioFeed()
        }
    case .authorized:
        captureSession.beginConfiguration()
        let captureAudioDevice = AVCaptureDevice.default(for: .audio)!
        let captureAudioInput = try! AVCaptureDeviceInput(device: captureAudioDevice)
        if captureSession.canAddInput(captureAudioInput) {
            captureSession.addInput(captureAudioInput)
        }
        let captureAudioOutput = AVCaptureAudioDataOutput()
        captureAudioOutput.audioSettings = [AVFormatIDKey: kAudioFormatLinearPCM, AVNumberOfChannelsKey: NSNumber(value: 1), AVSampleRateKey: NSNumber(value: 44100)]
        captureAudioOutput.setSampleBufferDelegate(self, queue: DispatchQueue.main)
        if captureSession.canAddOutput(captureAudioOutput) {
            captureSession.addOutput(captureAudioOutput)
        }
        captureSession.commitConfiguration()
    default:
        break
    }
}

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    if let imageBuffer = sampleBuffer.imageBuffer {
        // Video frame: encode it as JPEG data and hand it to play(data:).
        let ciImage = CIImage(cvPixelBuffer: imageBuffer)
        let cgImage = CIContext().createCGImage(ciImage, from: ciImage.extent)!
        let data = CFDataCreateMutable(nil, 0)!
        let imageDestination = CGImageDestinationCreateWithData(data, kUTTypeJPEG, 1, nil)!
        CGImageDestinationAddImage(imageDestination, cgImage, [kCGImageDestinationLossyCompressionQuality: NSNumber(value: 0)] as CFDictionary)
        CGImageDestinationFinalize(imageDestination)
        play(data: data as Data)
    } else if let dataBuffer = sampleBuffer.dataBuffer {
        // Audio sample buffer: extract the raw bytes and hand them to play(data:).
        let data = try! dataBuffer.dataBytes()
        play(data: data)
    }
}

private func play(data: Data) {
    if let image = CGImage(jpegDataProviderSource: CGDataProvider(data: data as CFData)!, decode: nil, shouldInterpolate: false, intent: .defaultIntent) {
        // image is a valid image
    } else if let audioPlayer = try? AVAudioPlayer(data: data) {
        audioPlayer.play()
        // audioPlayer is always nil with error: Error Domain=NSOSStatusErrorDomain Code=1954115647 "(null)"
    }
}
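As far as I understand, AVAudioPlayer expects data in a complete audio file format, whereas the AVCaptureAudioDataOutput above delivers raw, headerless LinearPCM; the OSStatus 1954115647 appears to be the FourCC 'typ?' (kAudioFileUnsupportedFileTypeError), which would point in the same direction. Purely as a hypothetical diagnostic step (this helper is not part of my actual code), logging the exact stream format of the incoming sample buffers might show what the playback side would have to be configured for:

import AVFoundation
import CoreMedia

// Hypothetical helper: print the PCM format of a captured audio sample buffer.
func logAudioFormat(of sampleBuffer: CMSampleBuffer) {
    guard let formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer),
          let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(formatDescription)?.pointee else { return }
    print("sample rate:", asbd.mSampleRate,
          "channels:", asbd.mChannelsPerFrame,
          "bits per channel:", asbd.mBitsPerChannel,
          "bytes per frame:", asbd.mBytesPerFrame,
          "format flags:", asbd.mFormatFlags)
}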
By adding this code to the default SceneKit Xcode project, one can reproduce the issue (the default ship object is blurred when viewed directly by the camera, and sharp when viewed through the semi-transparent square):
cameraNode.camera!.wantsDepthOfField = true
cameraNode.camera!.focusDistance = 2
cameraNode.camera!.fStop = 0.5
let plane = SCNNode(geometry: SCNPlane(width: 1, height: 1))
plane.position = SCNVector3(x: 0.5, y: 0, z: 13)
plane.opacity = 0.5
scene.rootNode.addChildNode(plane)
Is this expected behavior, and is there a workaround to make objects seen through semi-transparent objects appear blurred as well?
This code strangely doesn't animate the fStop:
cameraNode.camera!.wantsDepthOfField = true
cameraNode.camera!.focusDistance = 2
let animation = CABasicAnimation(keyPath: "fStop")
animation.toValue = 0.5
animation.duration = 0.3
cameraNode.camera!.addAnimation(animation, forKey: nil)
while this one does:
SCNTransaction.begin()
SCNTransaction.animationDuration = 0.3
cameraNode.camera!.fStop = 0.5
SCNTransaction.commit()
Why?
I create a URL bookmark with URL.bookmarkData(options: [], includingResourceValuesForKeys: [.localizedNameKey]) and resolve it with NSURL(resolvingBookmarkData: bookmarkData, options: [], relativeTo: nil, bookmarkDataIsStale: nil) as URL. This works fine within my main app, but when sharing the bookmarkData via an App Group with my Share Extension, resolving gives the error "The file couldn't be opened because you don't have permission to view it." Is there any way I can make this work?
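To make the flow concrete, here is a hypothetical sketch of what I am doing; the app group identifier, the defaults key and the function names are placeholders, not code from my project:

import Foundation

let appGroupID = "group.com.example.myapp"   // placeholder
let sharedDefaults = UserDefaults(suiteName: appGroupID)!

// In the main app: create the bookmark and store it in the shared container.
func storeBookmark(for url: URL) throws {
    let bookmarkData = try url.bookmarkData(options: [], includingResourceValuesForKeys: [.localizedNameKey], relativeTo: nil)
    sharedDefaults.set(bookmarkData, forKey: "sharedBookmark")
}

// In the Share Extension: read and resolve the bookmark.
// This is where I get "The file couldn't be opened because you don't have permission to view it."
func resolveSharedBookmark() throws -> URL? {
    guard let bookmarkData = sharedDefaults.data(forKey: "sharedBookmark") else { return nil }
    return try NSURL(resolvingBookmarkData: bookmarkData, options: [], relativeTo: nil, bookmarkDataIsStale: nil) as URL
}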
For a couple of years now Xcode has been showing many similar warnings when opening a storyboard file. I opened FB8245368 more than a year ago and have received no response. Is there a way of resolving these warnings inside the storyboard without adding artificial constraints that have to be removed at build time?
Open the project at https://www.icloud.com/iclouddrive/0bNxa2_8jNRVFbpKsTyrB5yMg#problem
Select that warning, then click Update Frames at the bottom right of the canvas. You can keep pressing the button until the view is entirely collapsed.
It seems that when typing inside a text view that has insets and line fragment padding, the caret keeps jumping up and down on most keystrokes. Is there a solution to this?
class ViewController: UIViewController, UITextViewDelegate {
    @IBOutlet weak var textView: TextView!

    override func viewDidLayoutSubviews() {
        let h = textView.bounds.size.height / 2
        textView.textContainerInset = UIEdgeInsets(top: h, left: 0, bottom: h, right: 0)
        textView.textContainer.lineFragmentPadding = 200
    }

    func textViewDidChange(_ textView: UITextView) {
        textView.textStorage.addAttribute(.foregroundColor, value: UIColor.red, range: NSRange(location: 0, length: 1))
    }
}

class TextView: UITextView {
    override func caretRect(for position: UITextPosition) -> CGRect {
        let r = super.caretRect(for: position)
        print(r)
        return r
    }
}
A complete project can be found here: https://www.icloud.com/iclouddrive/0yEBQZPUCZQH1o3HiL8_6q_gQ#problem_copia
Every now and then (at least once a week, but sometimes many times within a day), when I try to deploy to my iPad, Xcode shows a sheet reading "iPad is not connected", even though it was working a minute earlier. I reported this several months ago, but apparently there is no fix for it yet. Usually restarting the Mac and/or the iPad solves the issue, but it's really annoying to have to do this repeatedly. Is there an easier workaround?
For many months now, the projects that I download back from most of the feedbacks I submit no longer build: at some stage some settings in the Xcode project get changed so that the directory structure is wrong. The top-level folder named after the project is highlighted in red, and the only compiler error is about a missing entitlements file, even though all the files are present in the project folder. I reported this months ago, but it is still happening. What's the problem, and why is there no solution to this yet?
I want to debug my QuickLook extension for macOS. I read somewhere online that when running it in Xcode, I have to select Quick Look Simulator, but I have no idea what to do after that, and I couldn't find any official documentation.
Is it possible?
On the Mac we can get a font with a given family, traits and weight via NSFontManager.shared.font(withFamily:traits:weight:size:), but I couldn't find any way to do this on iOS, not even with UIFontDescriptor.
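For illustration, this is the kind of UIFontDescriptor-based attempt I mean (the family name, weight and traits are arbitrary examples); I am not sure whether this is supposed to be the iOS equivalent of the NSFontManager call:

import UIKit

// Hypothetical attempt: build a descriptor from a family name plus weight and symbolic traits.
let traits: [UIFontDescriptor.TraitKey: Any] = [
    .weight: UIFont.Weight.semibold.rawValue,
    .symbolic: UIFontDescriptor.SymbolicTraits.traitItalic.rawValue
]
let descriptor = UIFontDescriptor(fontAttributes: [
    .family: "Helvetica Neue",   // arbitrary example family
    .traits: traits
])
let font = UIFont(descriptor: descriptor, size: 17)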
Often, performing the swipe-right gesture to show the previously opened file doesn't work immediately. When this happens, which is countless times a day, I have to repeat it exactly three times before it works.
I submitted a bug report in May 2018, 3.5 years ago, and it is still marked as "Similar reports: None" and "Resolution: Open". I asked for updates a couple of times and never received a response. This makes me angry. My brain has been rewired so that when (if) this issue is finally fixed, I will probably keep swiping three times for another two years. At the same time, it makes me extremely sad. How can the engineers at Apple possibly leave me in the dark about whether they could reproduce the issue, are working on it, or simply consider it not worth fixing?