
diffable data source can't identify items?
I have a list like this:

id: 1, image: image1, title: title1, badge: 0
id: 2, image: image2, title: title2, badge: 0
id: 3, image: image3, title: title3, badge: 0
...

Is my understanding correct that, in order to get a smooth "expected" animation when I want to change both an item's badge and its position in the list, I have to manually split this "big" update into two smaller updates (first the change, then the move, or vice versa)? This is somewhat surprising; I would expect a diffable implementation to have a notion of "identity" (in the example above it's "id") and to calculate the differences based on that identity plus an item equivalence check, rather than just on the hash/equality check for the whole item.
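For reference, a minimal sketch of the identity-based setup I mean, assuming UICollectionViewDiffableDataSource and a hypothetical Item type: hashing/equality use only the id, so moves are diffed by identity, and content changes for the same identity have to be pushed explicitly via reloadItems.

import UIKit

// Hypothetical item: identity is `id`; `badge` is mutable content.
struct Item: Hashable {
    let id: Int
    var title: String
    var badge: Int

    // Identity-only equality/hashing so the diff tracks moves by `id`.
    static func == (lhs: Item, rhs: Item) -> Bool { lhs.id == rhs.id }
    func hash(into hasher: inout Hasher) { hasher.combine(id) }
}

func applyBadgeChangeAndMove(items: [Item], // items already in their new order
                             changed: Item,
                             dataSource: UICollectionViewDiffableDataSource<Int, Item>) {
    var snapshot = NSDiffableDataSourceSnapshot<Int, Item>()
    snapshot.appendSections([0])
    snapshot.appendItems(items)     // the move, diffed by identity
    snapshot.reloadItems([changed]) // the content change for the same identity
    dataSource.apply(snapshot, animatingDifferences: true)
}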
Topic: UI Frameworks · SubTopic: UIKit
11 replies · 0 boosts · 8k views · Jul ’20
DispatchQueue.current
Hello, I don't think this is provided by the system already, so I'd like to implement a smart version of the DispatchQueue.async function: one that does not reschedule the block if I am already on the queue in question, and instead calls the block directly in that case.

extension DispatchQueue {
    func asyncSmart(execute: @escaping () -> Void) {
        if DispatchQueue.current === self { // ????? no such API
            execute()
        } else {
            async(execute: execute)
        }
    }
}

The immediate problem is that there is no way to get the current queue (in order to compare it with the queue in question and branch the logic). Has anyone been through this and solved the puzzle?
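One workaround sketch for queues you create yourself: tag each queue with a unique value under a DispatchSpecificKey, then compare the current queue's tag (which the class method DispatchQueue.getSpecific reads) against this queue's tag. Note that the lookup also consults target queues, so a tag set on a target is visible from queues targeting it.

import Dispatch

let queueTag = DispatchSpecificKey<UUID>()

extension DispatchQueue {
    // Call once on each queue that should participate.
    func tagForSmartAsync() {
        setSpecific(key: queueTag, value: UUID())
    }

    func asyncSmart(execute: @escaping () -> Void) {
        // Instance getSpecific reads this queue's tag; the class method
        // reads the tag of the queue we are currently running on.
        if let mine = getSpecific(key: queueTag),
           DispatchQueue.getSpecific(key: queueTag) == mine {
            execute()
        } else {
            async(execute: execute)
        }
    }
}

// Usage (label hypothetical):
// let q = DispatchQueue(label: "com.example.work")
// q.tagForSmartAsync()
// q.asyncSmart { /* runs inline if already on q, otherwise async */ }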
10 replies · 1 boost · 1.0k views · Jul ’20
a virtual macOS microphone
Hello, how do I create a virtual microphone on macOS that can be selected as the default input device in System Settings or in apps like FaceTime / QuickTime Player / Skype, etc? Is an Audio HAL plugin the way to go? I've seen this macOS 10.15 note: "Legacy Core Audio HAL audio hardware plug-ins are no longer supported. Use Audio Server plug-ins for audio drivers." Though I am not sure if that's applicable, as I can think of these interpretations:

1. "Legacy Core Audio HAL audio hardware plug-ins are no longer supported" (but you can still use non-legacy ones).
2. "Legacy Core Audio HAL audio hardware plug-ins are no longer supported" (but you can still use non-hardware ones).
3. "Legacy Core Audio HAL audio hardware plug-ins are no longer supported" (if you used that functionality to implement audio hardware drivers then you can use Audio Server plug-ins instead; otherwise you are screwed).

The "Audio Server plugin" documentation is minimalistic: https://developer.apple.com/library/archive/qa/qa1811/_index.html which leads to 2013 sample code: https://developer.apple.com/library/archive/samplecode/AudioDriverExamples/Introduction/Intro.html containing a "NullAudio" plugin and a kernel-extension-backed plugin; I wasn't able to resurrect either of them (I'm on macOS Catalina now). Any hints?
5 replies · 0 boosts · 6.0k views · Aug ’20
AVCaptureDevice.videoZoomFactor vanishing point wanted
Before I dive into implementing this myself: is there a way of specifying the vanishing point for a zoom operation? AVCaptureDevice.videoZoomFactor: "The device achieves a zoom effect by cropping around the center of the image captured by the sensor." Looks like this is one of those one-trick-pony features that only works in the specific case of a vanishing point at the center; if I need a different vanishing point (e.g. to zoom into a corner), I have to implement that myself. FB8703018
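The crop geometry itself is simple. A sketch in normalized coordinates (helper name hypothetical): for zoom factor z the visible rect has side 1/z, and the origin is chosen so the vanishing point keeps the same relative position inside the rect.

import CoreGraphics

// Crop rect (normalized 0...1 coordinates) for zoom factor `z` that keeps
// the vanishing point `p` fixed. At z = 1 it is the whole frame; as z grows
// the rect shrinks toward `p`.
func cropRect(zoom z: CGFloat, vanishingPoint p: CGPoint) -> CGRect {
    let side = 1 / z
    return CGRect(x: p.x * (1 - side),
                  y: p.y * (1 - side),
                  width: side,
                  height: side)
}

// Example: zooming 2x into the top-left corner:
// cropRect(zoom: 2, vanishingPoint: .zero) == CGRect(x: 0, y: 0, width: 0.5, height: 0.5)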
1 reply · 0 boosts · 578 views · Sep ’20
Audio HAL plugin start/stop issue
Hello, I am implementing my audio HAL plugin based on the NullAudio sample code (fragment attached below). I noticed that the code implementing the start/stop counter is never triggered, as if Core Audio "optimizes" something internally and only calls "start" for the very first client and "stop" for the very last client:

ClientA starts IO // NullAudio_StartIO is called, ok
ClientB starts IO // NullAudio_StartIO is not called?
ClientA stops IO  // NullAudio_StopIO is not called?
ClientB stops IO  // NullAudio_StopIO is called, ok

That's a problem for me; is there any way around it? I want to see all starts/stops so I know the exact number of current clients.

static OSStatus NullAudio_StartIO(AudioServerPlugInDriverRef inDriver, AudioObjectID inDeviceObjectID, UInt32 inClientID)
{
    // This call tells the device that IO is starting for the given client. When this routine
    // returns, the device's clock is running and it is ready to have data read/written. It is
    // important to note that multiple clients can have IO running on the device at the same time.
    // So, work only needs to be done when the first client starts. All subsequent starts simply
    // increment the counter.
    ....
    // IO is already running, so just bump the counter
    ++gDevice_IOIsRunning; // NEVER CALLED
}

static OSStatus NullAudio_StopIO(AudioServerPlugInDriverRef inDriver, AudioObjectID inDeviceObjectID, UInt32 inClientID)
{
    ...
    // IO is still running, so just bump the counter
    --gDevice_IOIsRunning; // NEVER CALLED
}
0 replies · 0 boosts · 1.2k views · Oct ’20
mFramesPerPacket allowed values for compressed formats
The documentation for the mFramesPerPacket field of AudioStreamBasicDescription says: https://developer.apple.com/documentation/coreaudiotypes/audiostreambasicdescription/1423257-mframesperpacket?language=objc

"The number of frames in a packet of audio data. ... For variable bit-rate formats, the value is a larger fixed number, such as 1024 for AAC."

Is there a definitive table somewhere that lists the allowed values for codecs like AAC_ELD, Opus, etc? I know that 50 works for AAC_ELD, but only because I happened to try that particular value... maybe 49 or 51 works as well, and I just didn't try those. There must be a documented way, or an API, to determine the allowed values.
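One API-based approach I can think of, sketched under the assumption that kAudioFormatProperty_FormatInfo fills in the fields you leave at zero: give Core Audio just the format ID (plus whatever else you know) and read back the mFramesPerPacket it chooses.

import AudioToolbox

// Fill in what we know and let Core Audio complete the rest.
var asbd = AudioStreamBasicDescription()
asbd.mFormatID = kAudioFormatMPEG4AAC_ELD
asbd.mSampleRate = 44100
asbd.mChannelsPerFrame = 1

var size = UInt32(MemoryLayout<AudioStreamBasicDescription>.size)
let status = AudioFormatGetProperty(kAudioFormatProperty_FormatInfo,
                                    0, nil, &size, &asbd)
if status == noErr {
    // The default packet size Core Audio picked for this codec.
    print("mFramesPerPacket:", asbd.mFramesPerPacket)
}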
0 replies · 0 boosts · 522 views · Oct ’20
CallKit outgoing calls UI
I thought it was impossible to have CallKit show the system UI for outgoing calls, but then I saw this: "For incoming and outgoing calls, CallKit displays the same interfaces as the Phone app..." https://developer.apple.com/documentation/callkit How do I present it, though? Or is this a documentation error?
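For context, how I start an outgoing call today; a minimal sketch (phone number hypothetical). This routes the call through the system (lock screen controls, the green status indicator, call history), but in my experience no full-screen UI appears for it:

import CallKit

let callController = CXCallController()

// Request an outgoing call from the system.
let handle = CXHandle(type: .phoneNumber, value: "+15551234567") // hypothetical
let startAction = CXStartCallAction(call: UUID(), handle: handle)
callController.request(CXTransaction(action: startAction)) { error in
    if let error = error {
        print("start call failed:", error)
    }
    // On success, CXProviderDelegate's provider(_:perform:) is invoked
    // with the CXStartCallAction to actually begin the call.
}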
2 replies · 1 boost · 1.2k views · Oct ’20
Sandboxed process vs writable filesystem locations
https://developer.apple.com/library/archive/qa/qa1811/_index.html

This is old, I know. Can those in the know shed some light on whether the bit about the temp/cache locations is still correct:

"An AudioServerPlugIn operates in a limited environment. ... Further, the host process is sandboxed. As such, an AudioServerPlugIn may only read files in its bundle in addition to the system libraries and frameworks. It may not access user documents or write to any filesystem locations other than the system's cache and temporary directories as derived through Apple API."

I tried to write to the temporary folder (obtained with NSTemporaryDirectory; also tried /tmp and /private/tmp) and to the caches folder (obtained with NSSearchPathForDirectoriesInDomains + cachesDirectory), but everything I tried resulted in permission errors.
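For concreteness, a sketch of the probe I ran (file name hypothetical), with both locations derived through Apple API as the Q&A suggests:

import Foundation

let candidates = [
    NSTemporaryDirectory(),
    NSSearchPathForDirectoriesInDomains(.cachesDirectory, .userDomainMask, true).first
].compactMap { $0 }

for dir in candidates {
    let path = (dir as NSString).appendingPathComponent("probe.txt")
    do {
        try "x".write(toFile: path, atomically: true, encoding: .utf8)
        print("writable:", path)
    } catch {
        print("failed:", path, error) // permission errors land here
    }
}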
2 replies · 0 boosts · 1.3k views · Nov ’20
on GCD, memory barriers, and paranoid minds
Hello, I know I can be overly paranoid at times... but is it the case that I need to protect a global memory location (that I read from / write to) with read/write memory barriers even if I'm only accessing that location from a single serial DispatchQueue?

Consider that GCD can pick a different thread to run operations on that queue, and that different threads can run on different cores and thus have different L1/L2 caches. So a barrier-unprotected write to a location in one code invocation (running on my serial queue, which happens to be bound to thread1/core1 at the time) might not yet be visible to a different invocation of my code that runs a bit later on the same serial GCD queue, but now happens to run on thread2/core2, so that the barrier-unprotected read from that global memory location returns stale data.

Is the answer any different if we consider a serial (maxConcurrentOperationCount = 1) OperationQueue instead of a serial dispatch queue?

Finally, is the answer any different if we consider a single NSThread / pthread instead of a serial dispatch queue? Can a single thread be bound to different cores during its lifetime (e.g. work on one core, then sleep, then awake on a different core)?

Thank you.
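To make the pattern concrete, here is the kind of confinement I mean (a sketch; label and names hypothetical). My understanding is that a serial queue runs its blocks one at a time in FIFO order and takes care of the memory ordering between them, whichever thread or core each block lands on:

import Dispatch

final class Counter {
    private let queue = DispatchQueue(label: "com.example.counter") // serial
    private var value = 0 // only ever touched on `queue`

    func increment() {
        queue.async { self.value += 1 } // write happens on the queue
    }

    func read(_ completion: @escaping (Int) -> Void) {
        queue.async { completion(self.value) } // sees all prior increments
    }
}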
1 reply · 0 boosts · 1.1k views · Mar ’21
SwiftUI TextEditor autoscrolling
How do I make a TextEditor autoscroll? I want to implement a log view based on it: when the scroll position is at the bottom, adding new lines should autoscroll so the newly added lines are visible; when the scroll position is not at the bottom, adding new lines should not autoscroll.
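In case TextEditor itself can't do this, a fallback sketch I'm considering for a read-only log: a ScrollViewReader over a list of lines, scrolling to the last line whenever the count changes (lines and the autoScroll flag are placeholders; detecting "user is at the bottom" still needs real logic):

import SwiftUI

struct LogView: View {
    let lines: [String]
    @State private var autoScroll = true // placeholder for "is at bottom"

    var body: some View {
        ScrollViewReader { proxy in
            ScrollView {
                LazyVStack(alignment: .leading) {
                    ForEach(lines.indices, id: \.self) { i in
                        Text(lines[i]).id(i)
                    }
                }
            }
            .onChange(of: lines.count) { count in
                // Follow the tail only while pinned to the bottom.
                if autoScroll, count > 0 {
                    proxy.scrollTo(count - 1, anchor: .bottom)
                }
            }
        }
    }
}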
3 replies · 2 boosts · 2.6k views · Jul ’21
JSON from inputStream
Is JSONSerialization.jsonObject(with: inputStream) reliable? Sometimes it works fine (e.g. with small objects) and sometimes it blocks forever (the block is easier to reproduce with big objects), yet sometimes it works OK even with big objects. I tried calling it on a different queue; that didn't help.
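The workaround I'm leaning towards, sketched below: drain the stream into Data in my own read loop and hand the complete Data to JSONSerialization, so any blocking is confined to code I control:

import Foundation

func jsonObject(from stream: InputStream) throws -> Any {
    stream.open()
    defer { stream.close() }

    var data = Data()
    let bufferSize = 64 * 1024
    let buffer = UnsafeMutablePointer<UInt8>.allocate(capacity: bufferSize)
    defer { buffer.deallocate() }

    while true {
        let n = stream.read(buffer, maxLength: bufferSize)
        if n < 0 { throw stream.streamError ?? CocoaError(.fileReadUnknown) }
        if n == 0 { break } // end of stream
        data.append(buffer, count: n)
    }
    return try JSONSerialization.jsonObject(with: data)
}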
13 replies · 0 boosts · 3.2k views · Aug ’21
DateFormatter vs leap seconds
Is it a bug that NSDateFormatter knows about leap days but not about leap seconds?

let f = DateFormatter()
f.timeZone = TimeZone(identifier: "UTC")
f.dateFormat = "yyyy/MM/dd HH:mm:ss"

// leap day in the most recent leap year
let t1 = f.date(from: "2020/02/29 00:00:00") // 2020-02-29 00:00:00 UTC

// the most recent leap second
let t2 = f.date(from: "2016/12/31 23:59:60") // nil
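A related observation: Foundation dates, like Unix time, live on a timescale without leap seconds, so the inserted second doesn't exist in the model at all; the step from 23:59:59 to the next midnight measures as exactly one second.

// Reusing `f` from the snippet above.
let a = f.date(from: "2016/12/31 23:59:59")!
let b = f.date(from: "2017/01/01 00:00:00")!
print(b.timeIntervalSince(a)) // 1.0, the leap second is not counted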
4 replies · 2 boosts · 1.6k views · Oct ’21
iOS Thumbnail Extension vs Network access
Is it possible to use the network from within an iOS Thumbnail Extension? I tried it; it works fine in the simulator, but I'm getting this error when running on a real device:

networkd_settings_read_from_file Sandbox is preventing this process from reading networkd settings file at "/Library/Preferences/com.apple.networkd.plist", please add an exception.

Adding the "App Transport Security Settings / Allow Arbitrary Loads" plist entry didn't help. As the error seems to be about access to a particular file, I tried adding "com.apple.security.temporary-exception.files.absolute-path.read-only", but it didn't help, and it looks like it couldn't help on iOS anyway: "Note: This chapter describes property list keys specific to the macOS implementation of App Sandbox. They are not available in iOS."
3 replies · 0 boosts · 2.6k views · Mar ’22
Shape detection for other apps? (PencilKit, iOS 15)
I tried to ask this as a comment in another thread: https://developer.apple.com/forums/thread/650386?answerId=628394022#reply-to-this-question but I can't leave a comment there for some reason (is the thread locked?), so I'm asking exactly the same question, now for iOS 15. Has anything changed in this area?

When selecting a stroke path for an object on a PKCanvas, the option "Snap to Shape" appears. I understand this function is still in beta and has not been made available natively to other PencilKit apps. Is there a way, using the Stroke API, to call this function directly after the user holds the pencil for half a second when the stroke is done drawing, just like it behaves in native apps?
1 reply · 1 boost · 1.8k views · Apr ’22