SwiftUI's colorScheme modifier is said to be deprecated in favour of preferredColorScheme, but the two work differently. In the sample app below, colorScheme changes the underlying view's colour while preferredColorScheme doesn't. Is that a bug in preferredColorScheme?
import SwiftUI
import UIKit

struct ContentView: View {
    // A dynamic colour: red in light mode, green in dark mode.
    let color = Color(light: .red, dark: .green)

    var body: some View {
        VStack {
            HStack {
                color.colorScheme(.light)
                color.colorScheme(.dark)
            }
            HStack {
                color.preferredColorScheme(.light)
                color.preferredColorScheme(.dark)
            }
        }
    }
}

#Preview {
    ContentView()
}

@main struct TheApp: App {
    var body: some Scene {
        WindowGroup { ContentView() }
    }
}

extension UIColor {
    /// A dynamic UIColor that resolves from the current trait collection.
    convenience init(light: UIColor, dark: UIColor) {
        self.init { traits in
            switch traits.userInterfaceStyle {
            case .light: light
            case .dark: dark
            case .unspecified: fatalError()
            @unknown default: fatalError()
            }
        }
    }
}

extension Color {
    /// Bridges the dynamic UIColor into a SwiftUI Color.
    init(light: Color, dark: Color) {
        self.init(UIColor(light: UIColor(light), dark: UIColor(dark)))
    }
}
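For reference, a small probe view (SchemeProbe is just an illustrative name) that reads the environment's colorScheme; dropping one into each cell of the two HStacks shows which scheme each subtree actually resolves. If I read the docs right, .colorScheme(_:) overrides the environment for just its subtree, while .preferredColorScheme(_:) is a preference that bubbles up to the enclosing presentation, so every view in that presentation ends up with the same scheme, which would explain the behaviour without it being a bug.

struct SchemeProbe: View {
    // Reads the colour scheme actually resolved at this point in the hierarchy.
    @Environment(\.colorScheme) private var scheme

    var body: some View {
        Text(scheme == .dark ? "dark" : "light")
    }
}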
Hello,
how do I create a virtual microphone on macOS that can be selected as the default input device in System Settings or in apps like FaceTime / QuickTime Player / Skype, etc.?
Is an Audio HAL plug-in the way to go?
I've seen this macOS 10.15 note: "Legacy Core Audio HAL audio hardware plug-ins are no longer supported. Use Audio Server plug-ins for audio drivers." though I am not sure how that applies, as I can think of these interpretations:
1 "Legacy Core Audio HAL audio hardware plug-ins are no longer supported (but you can still use non-legacy ones.)
2 "Legacy Core Audio HAL audio hardware plug-ins are no longer supported." (but you can still use non-hardware ones".)
3 "Legacy Core Audio HAL audio hardware plug-ins are no longer supported". (if you used that functionality to implement audio hardware drivers then your you can use Audio Server plug-ins instead, otherwise you are screwed.)
The "Audio Server plugin" documentation is minimalistic:
https://developer.apple.com/library/archive/qa/qa1811/_index.html
which leads to a 2013 sample code:
https://developer.apple.com/library/archive/samplecode/AudioDriverExamples/Introduction/Intro.html
and contains a "nullAudio" plugin and a kernel extension backed plugin - neither of those i wasn't able to resurrect (i'm on macOS Catalina now).
any hints?
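For reference, a minimal host-side Swift sketch (illustrative only, not the plug-in itself) that lists the Core Audio device IDs; I'd expect a virtual device published by an Audio Server plug-in (typically installed in /Library/Audio/Plug-Ins/HAL) to show up in this list once the driver loads:

import CoreAudio

// Ask the system audio object for the list of all audio devices.
var address = AudioObjectPropertyAddress(
    mSelector: kAudioHardwarePropertyDevices,
    mScope: kAudioObjectPropertyScopeGlobal,
    mElement: kAudioObjectPropertyElementMaster)

var dataSize: UInt32 = 0
AudioObjectGetPropertyDataSize(AudioObjectID(kAudioObjectSystemObject),
                               &address, 0, nil, &dataSize)

var deviceIDs = [AudioObjectID](repeating: 0,
                                count: Int(dataSize) / MemoryLayout<AudioObjectID>.size)
AudioObjectGetPropertyData(AudioObjectID(kAudioObjectSystemObject),
                           &address, 0, nil, &dataSize, &deviceIDs)

// Each ID can then be queried (kAudioObjectPropertyName, kAudioDevicePropertyStreams, ...)
// to check whether the virtual microphone is among them.
print("audio device IDs:", deviceIDs)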
Is it a bug that NSDateFormatter knows about leap days but not about leap seconds?
import Foundation

let f = DateFormatter()
f.timeZone = TimeZone(identifier: "UTC")
f.dateFormat = "yyyy/MM/dd HH:mm:ss"
// last leap year
let t1 = f.date(from: "2020/02/29 00:00:00") // 2020-02-29 00:00:00 UTC
// last leap second
let t2 = f.date(from: "2016/12/31 23:59:60") // nil
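For what it's worth, a workaround sketch (the assumption being that a leap second can be folded into the following second, the way POSIX time does; dateAllowingLeapSecond is just an illustrative name):

// Rewrite a ":60" seconds field to ":59" and add one second after parsing.
func dateAllowingLeapSecond(from string: String, using formatter: DateFormatter) -> Date? {
    guard string.hasSuffix(":60") else { return formatter.date(from: string) }
    let clamped = String(string.dropLast(3)) + ":59"
    return formatter.date(from: clamped)?.addingTimeInterval(1)
}

let t3 = dateAllowingLeapSecond(from: "2016/12/31 23:59:60", using: f) // 2017-01-01 00:00:00 UTC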
I tried to ask this as a comment in the other thread:
https://developer.apple.com/forums/thread/650386?answerId=628394022#reply-to-this-question
But I can't leave a comment there for some reason (is the thread locked?). I'm asking exactly the same question, now for iOS 15. Has anything changed in this area?
When selecting a stroke path for an object on a PKCanvas, the option "Snap to Shape" appears.
I understand this function is still in beta and has not been made available natively to other PencilKit apps. Is there a way, using the stroke API, to call this function directly after the user holds the pencil for half a second once a stroke is done being drawn, just like it behaves in the native apps?
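To frame the question a bit: a minimal sketch of where the finished stroke is reachable. PKCanvasViewDelegate and PKDrawing.strokes are public API, but the shape-fitting step is only a placeholder, since I haven't found any public call that triggers the system's Snap to Shape behaviour:

import PencilKit

final class CanvasDelegate: NSObject, PKCanvasViewDelegate {
    func canvasViewDrawingDidChange(_ canvasView: PKCanvasView) {
        // The finished stroke is available here; custom shape recognition
        // (line / ellipse / rectangle fitting) could run on its points and
        // replace it with an idealised stroke.
        guard let lastStroke = canvasView.drawing.strokes.last else { return }
        let points = lastStroke.path.map { $0.location }
        print("finished stroke with \(points.count) points")
    }
}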
How do I make TextEditor auto-scroll? I want to implement a log view based on it: when the scroll position is at the bottom, adding new lines should scroll the content up so the newly added lines are visible; when the scroll position is not at the bottom, adding new lines should not auto-scroll it.
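A sketch of one possible approach (assuming a ScrollView of Text lines can stand in for TextEditor, since TextEditor doesn't expose its scroll position; LogView and the lines array are hypothetical). It always scrolls to the bottom when new lines arrive; gating that on "was already at the bottom" would need extra offset tracking, e.g. a GeometryReader plus a PreferenceKey:

import SwiftUI

struct LogView: View {
    let lines: [String]   // hypothetical log model

    var body: some View {
        ScrollViewReader { proxy in
            ScrollView {
                LazyVStack(alignment: .leading, spacing: 2) {
                    ForEach(Array(lines.enumerated()), id: \.offset) { _, line in
                        Text(line)
                            .font(.system(.footnote, design: .monospaced))
                    }
                    // Invisible anchor to scroll to.
                    Color.clear.frame(height: 1).id("bottom")
                }
            }
            .onChange(of: lines.count) { _ in
                // New lines arrived: keep the latest ones visible.
                withAnimation { proxy.scrollTo("bottom") }
            }
        }
    }
}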