In the past, I used UIApplication.shared.windows.first?.safeAreaInsets.bottom to get the bottom safe area inset, but on iOS 15 a warning appears saying that windows has been deprecated.
My project is written in SwiftUI, so is there still a way to get the global safe area insets that works on iOS 15?
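For reference, the kind of replacement I've been experimenting with walks the connected scenes instead of the deprecated windows array (I'm not sure this is the blessed approach; keyWindowSafeAreaInsets is my own helper name):

```swift
import SwiftUI
import UIKit

extension UIApplication {
    /// Safe area insets of the current key window, found by walking the
    /// connected scenes instead of the deprecated `windows` array.
    var keyWindowSafeAreaInsets: UIEdgeInsets {
        let keyWindow = connectedScenes
            .compactMap { $0 as? UIWindowScene }
            .flatMap { $0.windows }
            .first { $0.isKeyWindow }
        return keyWindow?.safeAreaInsets ?? .zero
    }
}

// Usage from SwiftUI:
// let bottom = UIApplication.shared.keyWindowSafeAreaInsets.bottom
```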
My App Shortcuts can only be added from the Shortcut Action List; no separate App Shortcuts section appears at the bottom of the Shortcuts app.
Right now that section only shows the default Voice Memos app.
It appeared successfully in beta 2/3 (I'm not sure whether it still appeared in the beta 3 update).
But in beta 4 it disappeared, and I have no idea how to make it visible again!
Menu {
    ShareLink(...)
    ShareLink(...)
    ShareLink(...)
} label: {
    Label("Share", systemImage: "square.and.arrow.up")
}
Here, I have different share options to choose from, but when I tap one of these share links, the share sheet doesn't pop up.
Is there any workaround for this issue?
Description
In Live Activities, we see many beautiful animations powered by .numericText(), such as Text(time, style: .timer) or even a Text with .contentTransition(.numericText()) applied.
But in a normal SwiftUI view, these beautiful animations seem to be gone; instead, we see a blinking or fade-in-out result.
Is that the expected behavior?
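For context, this is roughly what I'm doing in the normal view (the value state and tap gesture are placeholders just to drive the change):

```swift
import SwiftUI

struct CounterView: View {
    @State private var value = 0

    var body: some View {
        Text("\(value)")
            .font(.largeTitle.monospacedDigit())
            .contentTransition(.numericText())
            .onTapGesture {
                // The transition only runs when the change is animated.
                withAnimation(.easeInOut) { value += 1 }
            }
    }
}
```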
Screenshots
In Live Activity:
In normal SwiftUI View:
Error: build failed with "Command CompileSwift failed with a nonzero exit code"
My Code:
.backgroundTask(.appRefresh("checkValidity")) {
    // scheduleAppRefresh()
    // checkRecords()
}
In iOS 16, the primary action of a Menu cannot be triggered: the menu itself always pops up instead of performing the primary action.
I have tried previewing the same code in an iOS 15 simulator, and it works well there.
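A minimal reproduction of what I mean (the button titles and actions are placeholders):

```swift
import SwiftUI

struct MenuReproView: View {
    var body: some View {
        Menu {
            Button("Copy") { print("Copy") }
            Button("Delete") { print("Delete") }
        } label: {
            Label("Actions", systemImage: "ellipsis.circle")
        } primaryAction: {
            // Expected to run on a plain tap; on iOS 16 the menu pops up instead.
            print("Primary action")
        }
    }
}
```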
In the session, I saw a lot of examples of rainbow effects, and I'd like to bring them into my app, which is written in SwiftUI. The one thing I'm missing is how to implement customizable Markdown parsing and how to set styles for it. Could you give me some ideas or suggestions?
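To make the question concrete, here's the rough direction I've been exploring: parsing with AttributedString(markdown:) and restyling runs by their inline presentation intent (the purple color is just a stand-in for a real rainbow style, and styledMarkdown is my own helper name):

```swift
import SwiftUI

/// Parse Markdown and restyle strongly-emphasized runs; falls back to
/// plain text when the source fails to parse.
func styledMarkdown(_ source: String) -> AttributedString {
    guard var attributed = try? AttributedString(markdown: source) else {
        return AttributedString(source)
    }
    for run in attributed.runs {
        if let intent = run.inlinePresentationIntent,
           intent.contains(.stronglyEmphasized) {
            attributed[run.range].foregroundColor = .purple
        }
    }
    return attributed
}

// Text(styledMarkdown("Make **this** rainbow"))
```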
I have tried a lot of code to see what the difference between the browser and the editor is, but I can't find any changes between the two, so I'm wondering: what is the difference between the browser and the editor?
Here is the implementation.
import SwiftUI
import PencilKit

struct DrawingCanvas_bug: UIViewRepresentable {
    typealias UIViewType = PKCanvasView

    private let toolPicker = PKToolPicker()
    @Binding var drawing: PKDrawing
    var isOpaque: Bool = true
    var drawingDidChange: ((PKDrawing) -> Void)?

    func makeUIView(context: Context) -> PKCanvasView {
        let canvasView = PKCanvasView(frame: .zero)
        canvasView.drawing = drawing
        canvasView.delegate = context.coordinator
        canvasView.backgroundColor = .clear
        canvasView.isOpaque = isOpaque
        canvasView.alwaysBounceVertical = true
        // The size of the canvas is zero here, so I cannot update the zoom scale yet.
        // context.coordinator.updateZoomScale(for: canvasView)
        toolPicker.setVisible(true, forFirstResponder: canvasView)
        toolPicker.addObserver(canvasView)
        canvasView.becomeFirstResponder()
        return canvasView
    }

    func updateUIView(_ canvasView: PKCanvasView, context: Context) {
        DispatchQueue.main.async {
            // We get the correct zoom scale and content size here, but part of
            // the drawing is not visible (the select tool can still select it).
            context.coordinator.updateZoomScale(for: canvasView)
        }
    }

    func makeCoordinator() -> Coordinator {
        Coordinator(self)
    }

    static func dismantleUIView(_ canvasView: PKCanvasView, coordinator: Coordinator) {
        canvasView.resignFirstResponder()
    }
}

extension DrawingCanvas_bug {
    class Coordinator: NSObject, PKCanvasViewDelegate {
        var host: DrawingCanvas_bug

        init(_ host: DrawingCanvas_bug) {
            self.host = host
        }

        func canvasViewDrawingDidChange(_ canvasView: PKCanvasView) {
            host.drawing = canvasView.drawing
            if let action = host.drawingDidChange {
                action(canvasView.drawing)
            }
            updateContentSizeForDrawing(for: canvasView)
        }

        func updateZoomScale(for canvasView: PKCanvasView) {
            let canvasScale = canvasView.bounds.width / 768
            canvasView.minimumZoomScale = canvasScale
            canvasView.maximumZoomScale = canvasScale
            canvasView.zoomScale = canvasScale
            updateContentSizeForDrawing(for: canvasView)
        }

        func updateContentSizeForDrawing(for canvasView: PKCanvasView) {
            let drawing = canvasView.drawing
            let contentHeight: CGFloat
            if !drawing.bounds.isNull {
                contentHeight = max(canvasView.bounds.height, (drawing.bounds.maxY + 500) * canvasView.zoomScale)
            } else {
                contentHeight = canvasView.bounds.height
            }
            canvasView.contentSize = CGSize(width: 768 * canvasView.zoomScale, height: contentHeight)
        }
    }
}
And here is how I use it:
NavigationSplitView(columnVisibility: .constant(.doubleColumn)) {
    List(selection: $selection) {
        ForEach(drawingModel.drawings, id: \.uuidString) {
            DrawingRow(drawingData: $0)
        }
    }
} detail: {
    DrawingView()
}
.navigationSplitViewStyle(.balanced) // <- With the automatic style, PKCanvasView's size is correct
I created a function to add my course events to the Calendar app using EventKit.
After learning Swift concurrency, I want to update my code to make the process faster, namely by using a detached task or a TaskGroup to add these events.
The sequential version (one detached task, no task group):
func export_test() {
    Task.detached {
        for i in 0...15 {
            print("Task \(i): Start")
            let courseEvent = EKEvent(eventStore: eventStore)
            courseEvent.title = "TEST"
            courseEvent.location = "TEST LOC"
            courseEvent.startDate = .now
            courseEvent.endDate = .now.addingTimeInterval(3600)
            courseEvent.calendar = eventStore.defaultCalendarForNewEvents
            courseEvent.addRecurrenceRule(EKRecurrenceRule(recurrenceWith: .daily, interval: 1, end: nil))
            do {
                try eventStore.save(courseEvent, span: .futureEvents)
            } catch { print(error.localizedDescription) }
            print("Task \(i): Finished")
        }
    }
}
Doing the same thing using a TaskGroup:
func export_test() {
    Task.detached {
        await withTaskGroup(of: Void.self) { group in
            for i in 0...15 {
                group.addTask {
                    print("Task \(i): Start")
                    let courseEvent = EKEvent(eventStore: eventStore)
                    courseEvent.title = "TEST"
                    courseEvent.location = "TEST LOC"
                    courseEvent.startDate = .now
                    courseEvent.endDate = .now.addingTimeInterval(3600)
                    courseEvent.calendar = eventStore.defaultCalendarForNewEvents
                    courseEvent.addRecurrenceRule(EKRecurrenceRule(recurrenceWith: .daily, interval: 1, end: nil))
                    do {
                        try eventStore.save(courseEvent, span: .futureEvents)
                    } catch { print(error.localizedDescription) }
                    print("Task \(i): Finished")
                }
            }
        }
    }
}
The output of the TaskGroup version:
Task 0: Start
Task 1: Start
Task 2: Start
Task 4: Start
Task 3: Start
Task 5: Start
Task 6: Start
Task 7: Start
Task 0: Finished
Task 8: Start
Task 1: Finished
Task 9: Start
Sometimes only a few of the tasks finish and others don't, or are never even started (I created 16 tasks but only 9 printed in this example). Other times, all of the events are added.
From my point of view, I have created 16 child tasks in the TaskGroup.
Each child task adds one event to the calendar. I thought this way I could take full advantage of multi-core performance (maybe it's actually not. 🙃)
If I put the for-loop inside a single group.addTask closure, it always produces the expected result, but then there is only one loop, so the TaskGroup may no longer be needed.
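For reference, the always-working variant I described, with the whole loop inside a single child task (eventStore is the same store as above, and export_single_loop is my own name for it):

```swift
func export_single_loop() {
    Task.detached {
        await withTaskGroup(of: Void.self) { group in
            // One child task runs the entire loop, so the saves happen serially.
            group.addTask {
                for i in 0...15 {
                    print("Task \(i): Start")
                    let courseEvent = EKEvent(eventStore: eventStore)
                    courseEvent.title = "TEST"
                    courseEvent.location = "TEST LOC"
                    courseEvent.startDate = .now
                    courseEvent.endDate = .now.addingTimeInterval(3600)
                    courseEvent.calendar = eventStore.defaultCalendarForNewEvents
                    do {
                        try eventStore.save(courseEvent, span: .futureEvents)
                    } catch { print(error.localizedDescription) }
                    print("Task \(i): Finished")
                }
            }
        }
    }
}
```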
I'm really exhausted🙃🙃.
I'm trying to copy the colorful confetti effect from iMessage using SwiftUI's Canvas, and I am wondering how to apply a 3D transformation to each particle.
I have tried adding a projectionTransform in order to apply a CATransform3D, but it rotates the whole canvas rather than a particular particle, which is not the effect I want.
Currently, I use a very basic ForEach(particles.indices, id: \.self) loop to create each particle and use .rotation3DEffect to apply the transformation, but that may cause a performance issue (so I tried .drawingGroup()).
Are there any solutions for applying a 3D transformation to a particular particle in a Canvas?
My code (using the ForEach loop):
GeometryReader { proxy in
    let size = proxy.size
    TimelineView(.animation) { timeline in
        let _: () = {
            let now = timeline.date.timeIntervalSinceReferenceDate
            model.update(at: now)
        }()
        ZStack {
            ForEach(model.particles.indices, id: \.self) { index in
                let particle = model.particles[index]
                particle.shape
                    .fill(particle.color)
                    .rotation3DEffect(.degrees(particle.degrees), axis: (x: particle.x, y: particle.y, z: particle.z))
                    .frame(width: particle.frame.width, height: particle.frame.height)
                    .position(particle.frame.origin)
                    .tag(index)
            }
        }
        .frame(width: size.width, height: size.height)
        .drawingGroup()
    }
    .contentShape(Rectangle())
    .gesture(
        DragGesture(minimumDistance: 0)
            .onEnded { _ in model.loadEffect(in: size) }
    )
    .task { model.loadEffect(in: size) }
}
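One workaround I've been experimenting with inside Canvas itself, since GraphicsContext only accepts 2D CGAffineTransforms: approximate a rotation about the y-axis by foreshortening the particle's width with the cosine of the angle. This is only a sketch, not a true perspective transform, and the names and sizes here are placeholders:

```swift
import SwiftUI

/// Horizontal scale that fakes a y-axis rotation by the given angle.
func foreshortenScale(degrees: Double) -> CGFloat {
    CGFloat(abs(cos(degrees * .pi / 180)))
}

struct ForeshortenedParticle: View {
    var degrees: Double

    var body: some View {
        Canvas { context, size in
            var ctx = context
            // Move the origin to the particle's center, then squash horizontally.
            ctx.translateBy(x: size.width / 2, y: size.height / 2)
            ctx.scaleBy(x: foreshortenScale(degrees: degrees), y: 1)
            let rect = CGRect(x: -10, y: -5, width: 20, height: 10)
            ctx.fill(Path(rect), with: .color(.pink))
        }
    }
}
```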
I am working on a SwiftUI project, and in a subview I use @FetchRequest to fetch data from Core Data. I have a menu that lets the user select which data to fetch, and by default I want to fetch all the data.
The problem is that when the user opens the menu and selects a category, the data is refetched correctly, but when the user closes the menu, the nsPredicate the user set disappears and the request switches back to the default predicate.
I tried writing the same pattern in the 'Earthquakes' sample app, and it has the same problem.
And here is the code:
List(selection: $selection) {
    ListView(search: $searchText)
}
.background(toggle ? Color.red.opacity(0.01) : nil)
.toolbar {
    ToolbarItem(placement: .bottomBar) {
        Button {
            toggle.toggle()
        } label: {
            Text("Toggle")
        }
    }
}
.searchable(text: $searchText)
ListView:
struct ListView: View {
    @FetchRequest(sortDescriptors: [SortDescriptor(\.time, order: .reverse)]) // predicate: nil
    private var quakes: FetchedResults<Quake>
    @Binding var search: String

    var body: some View {
        ForEach(quakes, id: \.code) { quake in
            NavigationLink(destination: QuakeDetail(quake: quake)) {
                QuakeRow(quake: quake)
            }
        }
        .onChange(of: search) { newValue in
            quakes.nsPredicate = newValue.isEmpty ? nil : NSPredicate(format: "place CONTAINS %@", newValue)
        }
    }
}
I can filter the data by typing in the search field, but when I tap the Toggle button, it refreshes the parent view, which causes the @FetchRequest to update and nsPredicate to return to nil.
Is this a bug, or is my understanding wrong? Are there any good suggestions?
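One workaround I'm considering: build the FetchRequest in the subview's initializer from state the parent owns, so a parent refresh reconstructs the same predicate instead of resetting it. FilteredListView here is a hypothetical rewrite of the ListView above, using the Quake entity from the sample:

```swift
import SwiftUI
import CoreData

struct FilteredListView: View {
    @FetchRequest private var quakes: FetchedResults<Quake>

    init(search: String) {
        // The predicate is rebuilt from the parent's state on every refresh.
        _quakes = FetchRequest(
            sortDescriptors: [SortDescriptor(\.time, order: .reverse)],
            predicate: search.isEmpty ? nil : NSPredicate(format: "place CONTAINS %@", search)
        )
    }

    var body: some View {
        ForEach(quakes, id: \.code) { quake in
            QuakeRow(quake: quake)
        }
    }
}
```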
I have done the same thing in SwiftUI using UIViewRepresentable, but the tool picker doesn't show. I checked the isFirstResponder property and found that it was still false after I called canvas.becomeFirstResponder().
Check this out:
struct NoteCanvasView: UIViewRepresentable {
    func makeUIView(context: Context) -> PKCanvasView {
        let canvas = PKCanvasView()
        canvas.drawingPolicy = .anyInput
        canvas.delegate = context.coordinator
        let toolPicker = PKToolPicker()
        toolPicker.setVisible(true, forFirstResponder: canvas)
        toolPicker.addObserver(canvas)
        print(canvas.canBecomeFirstResponder)
        canvas.becomeFirstResponder()
        print(canvas.isFirstResponder)
        return canvas
    }

    func updateUIView(_ canvas: PKCanvasView, context: Context) {
        canvas.becomeFirstResponder()
    }

    func makeCoordinator() -> Coordinator {
        Coordinator(self)
    }

    class Coordinator: NSObject {
        var parent: NoteCanvasView

        init(_ parent: NoteCanvasView) {
            self.parent = parent
        }
    }
}
I found that canvas.canBecomeFirstResponder returns true, but canvas.isFirstResponder always returns false.
Is this a bug in the current version of SwiftUI?
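For comparison, a variant I've seen suggested: keep the PKToolPicker alive as a stored property (a picker local to makeUIView can be deallocated when the function returns) and defer becomeFirstResponder to the next run-loop turn so the canvas is already in a window. I haven't confirmed this is the root cause:

```swift
import SwiftUI
import PencilKit

struct NoteCanvasWorkaround: UIViewRepresentable {
    // Stored so the picker outlives makeUIView.
    private let toolPicker = PKToolPicker()

    func makeUIView(context: Context) -> PKCanvasView {
        let canvas = PKCanvasView()
        canvas.drawingPolicy = .anyInput
        toolPicker.setVisible(true, forFirstResponder: canvas)
        toolPicker.addObserver(canvas)
        // Defer until the canvas has joined the window hierarchy.
        DispatchQueue.main.async {
            canvas.becomeFirstResponder()
        }
        return canvas
    }

    func updateUIView(_ canvas: PKCanvasView, context: Context) {}
}
```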
I am working on an app that allows users to take notes, and I want to support inline editing, meaning users can use PencilKit features and edit text in a single note, just like the Notes app.
Is there any good way to achieve this using SwiftUI?
How do I add a custom button to the edit menu on both iOS and iPadOS natively using SwiftUI?
I have seen many ways to implement a custom button using UIKit, but I'm wondering how to achieve the same thing with SwiftUI.
I have never seen any modifiers or menus for this, so I guess there is no way to do it.
Any ideas? Or is this a future update for SwiftUI?