Is it possible for a widget to be reloaded when the photo library changes?
Outside of widgets, there is PHPhotoLibraryChangeObserver. Is it possible to use something similar to update the timeline (TimelineProvider.getTimeline(in:completion:)) whenever there is a change in the photo library?
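For concreteness, a minimal sketch of the kind of wiring I have in mind, assuming the observer lives in the main app and simply asks WidgetKit to reload (PhotoLibraryObserver is a placeholder name, not existing code):

import Photos
import WidgetKit

// Sketch only: observe photo library changes in the main app and ask
// WidgetKit to regenerate the widget's timeline.
final class PhotoLibraryObserver: NSObject, PHPhotoLibraryChangeObserver {
    override init() {
        super.init()
        PHPhotoLibrary.shared().register(self)
    }

    func photoLibraryDidChange(_ changeInstance: PHChange) {
        WidgetCenter.shared.reloadAllTimelines()
    }
}

This only runs while the app itself is alive, which is why I'm asking whether the widget can be reloaded directly when the library changes.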
Any idea when the AppleRAW APIs will be available?
Currently, I have an app that does not collect any user data, so its App Privacy listing on the App Store is "Does not collect any data."
However, it still receives crash reports and other diagnostics that are available via Xcode Organizer. I believe users enable this via the system-level prompt "Share analytics, diagnostics, and usage information with Apple."
Given the above, do I need to change the listing to say that the app explicitly collects crash data?
I am trying to display HDR images (ProRAW) within a UIImageView using preferredImageDynamicRange. This was shown in a WWDC 2023 video:
self.imageView = UIImageView()
if #available(iOS 17.0, *) {
    self.imageView.preferredImageDynamicRange = UIImage.DynamicRange.high
}
self.imageView.clipsToBounds = true
self.imageView.isMultipleTouchEnabled = true
self.imageView.contentMode = .scaleAspectFit
self.photoScrollView.addSubview(self.imageView)
I pull the image from PHImageManager:
let options = PHImageRequestOptions()
options.deliveryMode = .highQualityFormat
options.isNetworkAccessAllowed = true

PHImageManager.default().requestImage(for: asset, targetSize: self.targetSize(), contentMode: .aspectFit, options: options, resultHandler: { image, info in
    guard let image = image else {
        return
    }
    DispatchQueue.main.async {
        self.imageView.image = image
        if #available(iOS 17.0, *) {
            self.imageView.preferredImageDynamicRange = UIImage.DynamicRange.high
        }
    }
})
Issue
The image shows successfully, yet not in HDR mode (no bright specular highlights), unlike when the same ProRAW image is viewed in the native Camera app.
What am I missing here?
In the WWDC24 session video 'Enhance your UI animations and transitions', Apple shows these new animation methods for UIKit:
switch gesture.state {
case .changed:
    UIView.animate(.interactiveSpring) {
        bead.center = gesture.translation
    }
case .ended:
    UIView.animate(spring) {
        bead.center = endOfBracelet
    }
}
As of iOS 18 Beta 2, I get an error for `UIView.animate(.interactiveSpring)`.
Are these new methods not available yet?
I am following the Apple sample code and trying to add a manual focus lens position slider:
@available(iOS 18.0, *)
private func addCameraControls() {
    if !self.session.controls.isEmpty {
        for control in self.session.controls {
            self.session.removeControl(control)
        }
    }
    self.cameraControlFocusSlider = nil

    // Focus slider
    if self.videoDevice!.isLockingFocusWithCustomLensPositionSupported {
        self.cameraControlFocusSlider = AVCaptureSlider("Focus", symbolName: "dot.square", in: 0.0...1.0)
        self.cameraControlFocusSlider!.setActionQueue(self.sessionQueue) { focusValue in
            // Do manual focus
        }
        if self.session.canAddControl(self.cameraControlFocusSlider!) {
            self.session.addControl(self.cameraControlFocusSlider!)
        }
    }
}
So there are these AVCaptureSessionControlsDelegate methods:
final func sessionControlsDidBecomeActive(_ session: AVCaptureSession) {
    print("sessionControlsDidBecomeActive")
}

final func sessionControlsWillEnterFullscreenAppearance(_ session: AVCaptureSession) {
    print("sessionControlsWillEnterFullscreenAppearance")
}

final func sessionControlsWillExitFullscreenAppearance(_ session: AVCaptureSession) {
    print("sessionControlsWillExitFullscreenAppearance")
}

final func sessionControlsDidBecomeInactive(_ session: AVCaptureSession) {
    print("sessionControlsDidBecomeInactive")
}
When self.cameraControlFocusSlider is presented, I have to show the current value of the lens position. The lens position can change from autofocus and also from manual focus by the user through the app UI. Is there a way to tell whether self.cameraControlFocusSlider is active or being used?
Please note that I will have more than one AVCaptureSlider in the final code.
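For context, a minimal sketch of what the "Do manual focus" part could look like, assuming videoDevice is the active AVCaptureDevice and the slider value maps directly to the lens position (illustrative only, not my final code):

self.cameraControlFocusSlider!.setActionQueue(self.sessionQueue) { focusValue in
    // Sketch: lock the lens at the slider's position (0.0...1.0).
    guard let device = self.videoDevice,
          device.isLockingFocusWithCustomLensPositionSupported else { return }
    do {
        try device.lockForConfiguration()
        device.setFocusModeLocked(lensPosition: focusValue, completionHandler: nil)
        device.unlockForConfiguration()
    } catch {
        print("Could not lock \(device) for configuration: \(error)")
    }
}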
My implementation of LockedCameraCapture does not launch my app when tapped from the Lock Screen, but when the same widget is in Control Center, it launches the app successfully.
Standard Xcode target template:
Lock_Screen_Capture.swift
@main
struct Lock_Screen_Capture: LockedCameraCaptureExtension {
    var body: some LockedCameraCaptureExtensionScene {
        LockedCameraCaptureUIScene { session in
            Lock_Screen_CaptureViewFinder(session: session)
        }
    }
}
Lock_Screen_CaptureViewFinder.swift:
import SwiftUI
import UIKit
import UniformTypeIdentifiers
import LockedCameraCapture

struct Lock_Screen_CaptureViewFinder: UIViewControllerRepresentable {
    let session: LockedCameraCaptureSession
    var sourceType: UIImagePickerController.SourceType = .camera

    init(session: LockedCameraCaptureSession) {
        self.session = session
    }

    func makeUIViewController(context: Self.Context) -> UIImagePickerController {
        let imagePicker = UIImagePickerController()
        imagePicker.sourceType = sourceType
        imagePicker.mediaTypes = [UTType.image.identifier, UTType.movie.identifier]
        imagePicker.cameraDevice = .rear
        return imagePicker
    }

    func updateUIViewController(_ uiViewController: UIImagePickerController, context: Self.Context) {
    }
}
Then I have my widget:
struct CameraWidgetControl: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(
            kind: "com.myCompany.myAppName.lock-screen") {
            ControlWidgetButton(action: MyAppCaptureIntent()) {
                Label("Capture", systemImage: "camera.shutter.button.fill")
            }
        }
    }
}
My AppIntent:
struct MyAppContext: Codable {}

struct MyAppCaptureIntent: CameraCaptureIntent {
    typealias AppContext = MyAppContext

    static let title: LocalizedStringResource = "MyAppCaptureIntent"
    static let description = IntentDescription("Capture photos and videos with MyApp.")

    @MainActor
    func perform() async throws -> some IntentResult {
        .result()
    }
}
The Issue
The LockedCameraCapture widget does not launch my app when tapped from the Lock Screen. You get the Face ID prompt, and then it just takes you to the Home Screen. But when the same widget is in Control Center, it launches the app successfully.
Error Message
When tapped on Lock Screen, I get the following error code:
LaunchServices: store <private> or url <private> was nil: Error Domain=NSOSStatusErrorDomain Code=-54 "process may not map database" UserInfo={NSDebugDescription=process may not map database, _LSLine=72, _LSFunction=_LSServer_GetServerStoreForConnectionWithCompletionHandler}
Attempt to map database failed: permission was denied. This attempt will not be retried.
Failed to initialize client context with error Error Domain=NSOSStatusErrorDomain Code=-54 "process may not map database" UserInfo={NSDebugDescription=process may not map database, _LSLine=72, _LSFunction=_LSServer_GetServerStoreForConnectionWithCompletionHandler}
Things I tried
The widget image displays correctly.
The App ID and provisioning profile seem to be fine, since the same code works when injected into the AVCam sample app with the same App IDs.
The AppIntent file has target membership in both the Lock Screen capture extension and the widget.
The app compiles without errors or warnings.
let glassView = UIVisualEffectView(effect: UIGlassEffect(style: .clear))
glassView.frame = CGRect(x: 100, y: 200, width: 200, height: 400)
self.view.addSubview(glassView)
Though UIGlassEffect has two variants, .regular and .clear, even the clear one applies some blur to the background.
Is there a way to get absolutely no blur, while the edges still keep the glass effect?
Apple does this in two places: the Camera app and the text magnifier.
If the app is launched from LockedCameraCapture and the settings button is tapped, I need to launch the main app.
CameraViewController:
func settingsButtonTapped() {
    #if isLockedCameraCaptureExtension
    // App is launched from the Lock Screen
    // Launch main app here...
    #else
    // App is launched from the Home Screen
    self.showSettings(animated: true)
    #endif
}
In this document:
https://developer.apple.com/documentation/lockedcameracapture/creating-a-camera-experience-for-the-lock-screen
Apple asks you to use:
func launchApp(with session: LockedCameraCaptureSession, info: String) {
    Task {
        do {
            let activity = NSUserActivity(activityType: NSUserActivityTypeLockedCameraCapture)
            activity.userInfo = [UserInfoKey: info]
            try await session.openApplication(for: activity)
        } catch {
            StatusManager.displayError("Unable to open app - \(error.localizedDescription)")
        }
    }
}
However, the documentation states that this should be placed within the extension code (LockedCameraCapture). If I do that, how can I call it all the way down from the main app's CameraViewController?
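A minimal sketch of the kind of bridge I'm considering, assuming CameraViewController is shared source between the app and extension targets and the extension injects a callback when it creates the controller (the launchMainApp property is hypothetical, not from the sample):

import UIKit

final class CameraViewController: UIViewController {
    // Hypothetical: set by the LockedCameraCapture extension; nil in the main app.
    var launchMainApp: ((String) -> Void)?

    func settingsButtonTapped() {
        #if isLockedCameraCaptureExtension
        // Launched from the Lock Screen: hand off to the extension's launchApp(with:info:).
        launchMainApp?("settings")
        #else
        // Launched from the Home Screen.
        self.showSettings(animated: true) // existing method in my app
        #endif
    }
}

The extension scene would then set cameraViewController.launchMainApp = { info in launchApp(with: session, info: info) } when it creates the view controller.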
I've been using keywords separated by commas WITHOUT spaces. What is the recommended method, WITH or WITHOUT spaces? Or is there no difference?
Is using Apple Pencil's double-tap gesture within your app, which is not a drawing app, allowed? For example, to trigger an action within the app? After reading Apple's Human Interface Guidelines, it still seems somewhat ambiguous. Thanks in advance.
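For concreteness, this is the kind of hookup I mean; a minimal sketch using UIPencilInteraction, where the view controller name and the triggered action are placeholders:

import UIKit

final class ToolbarViewController: UIViewController, UIPencilInteractionDelegate {
    override func viewDidLoad() {
        super.viewDidLoad()
        let interaction = UIPencilInteraction()
        interaction.delegate = self
        view.addInteraction(interaction)
    }

    // Called when the user double-taps the side of a paired Apple Pencil.
    func pencilInteractionDidTap(_ interaction: UIPencilInteraction) {
        // Placeholder: trigger a non-drawing action in the app.
        print("Pencil double tap")
    }
}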
I have a child view controller added, and its view gets viewSafeAreaInsetsDidChange() called every time a frame change happens. How do I avoid this?
So far I am using these:
self.viewRespectsSystemMinimumLayoutMargins = false
self.view.insetsLayoutMarginsFromSafeArea = false
self.view.preservesSuperviewLayoutMargins = false
However, viewSafeAreaInsetsDidChange() is still being called. Is there a way to stop that?
With iOS 18, when you tint the Home Screen, the widgets also need to pick up the tint. How do you detect this within SwiftUI?
I didn't see any WWDC24 sessions addressing this.
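A minimal sketch of the kind of check I'm looking for, assuming the widgetRenderingMode environment value is the right hook (MyWidgetEntryView is a placeholder name):

import SwiftUI
import WidgetKit

struct MyWidgetEntryView: View {
    // Reports .accented when the system is applying a tint to the widget.
    @Environment(\.widgetRenderingMode) private var renderingMode

    var body: some View {
        if renderingMode == .accented {
            // Tinted Home Screen: the system recolors the content.
            Text("Tinted")
        } else {
            Text("Full color")
        }
    }
}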
Prior to iOS 26, this successfully gave me a modal view with a transparent background:
let settingsVC = MySettingsViewController()
settingsVC.modalPresentationStyle = .automatic
//settingsVC.modalPresentationStyle = .overCurrentContext
self.present(settingsVC, animated: true, completion: {
})
MySettingsViewController:
self.view.backgroundColor = UIColor(white: 0, alpha: 0.5)
Now in iOS 26, the modal view is presented with an opaque grey background.
Even though some apps out there have figured out how to do proper burst capture equal to the stock iOS Camera app, there is no official API to do so. Currently, my app calls capturePhoto over and over, which is not fast enough. I've seen some hints floating around that the proper way to do this is using bracketed capture; any input on this? Thanks in advance.
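For reference, a minimal sketch of the bracketed-capture idea being hinted at, assuming one capturePhoto call per trigger that yields several frames (the function name and settings here are illustrative, and the frame count is capped by maxBracketedCapturePhotoCount):

import AVFoundation

func captureBracketedBurst(from output: AVCapturePhotoOutput,
                           delegate: AVCapturePhotoCaptureDelegate) {
    // Three frames at the same exposure bias, delivered from a single trigger.
    let biases: [Float] = [0, 0, 0]
    let bracketed = biases.map {
        AVCaptureAutoExposureBracketedStillImageSettings.autoExposureSettings(exposureTargetBias: $0)
    }
    let settings = AVCapturePhotoBracketSettings(
        rawPixelFormatType: 0, // no RAW in this sketch
        processedFormat: [AVVideoCodecKey: AVVideoCodecType.jpeg],
        bracketedSettings: bracketed
    )
    output.capturePhoto(with: settings, delegate: delegate)
}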