If the app is launched from the LockedCameraCapture extension and the settings button is tapped, I need to launch the main app.
CameraViewController:
func settingsButtonTapped() {
    #if isLockedCameraCaptureExtension
    // App is launched from the Lock Screen
    // Launch main app here...
    #else
    // App is launched from the Home Screen
    self.showSettings(animated: true)
    #endif
}
In this document:
https://developer.apple.com/documentation/lockedcameracapture/creating-a-camera-experience-for-the-lock-screen
Apple asks you to use:
func launchApp(with session: LockedCameraCaptureSession, info: String) {
    Task {
        do {
            let activity = NSUserActivity(activityType: NSUserActivityTypeLockedCameraCapture)
            activity.userInfo = [UserInfoKey: info]
            try await session.openApplication(for: activity)
        } catch {
            StatusManager.displayError("Unable to open app - \(error.localizedDescription)")
        }
    }
}
However, the documentation states that this should be placed within the extension code (LockedCameraCapture). If I do that, how can I call it all the way from the main app's CameraViewController?
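One way to bridge the two, sketched under the assumption that CameraViewController is compiled into both the main app target and the capture extension target: have the extension inject a launch closure into the view controller and call it from settingsButtonTapped. The launchMainApp property is an invented name for illustration, not Apple API.
// Hypothetical sketch: `launchMainApp` is my addition; the extension sets it,
// and in the main app it stays nil.
var launchMainApp: (() -> Void)?

func settingsButtonTapped() {
    #if isLockedCameraCaptureExtension
    // Running inside the capture extension: hand off to the extension's launchApp helper.
    launchMainApp?()
    #else
    // Running in the main app: show settings as usual.
    self.showSettings(animated: true)
    #endif
}

// In the extension's scene / view-finder code, where the session is available:
// cameraVC.launchMainApp = { launchApp(with: session, info: "openSettings") }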
I have a main app that saves preferences to UserDefaults.standard. One of those preferences is a toggle the user can flip - isRawOn:
UserDefaults.standard.set(self.isRawOn, forKey: "isRawOn")
Now I have a LockedCameraCapture extension that needs to know whether that setting is on or off at launch. Also, if it's toggled within the extension, the main app should know about it on its next launch.
The main app and the extension run in separate containers, and preferences are not shared for privacy reasons.
Apple mentions using the appContext of CameraCaptureIntent, but I'm not sure how the above scenario is possible through that... unless I am missing something.
Apple Reference
What I have for CameraCaptureIntent:
@available(iOS 18, *)
struct LaunchMyAppControlIntent: CameraCaptureIntent {
    typealias AppContext = MyAppContext
    static let title: LocalizedStringResource = "LaunchMyAppControlIntent"
    static let description = IntentDescription("Capture photos with MyApp.")

    @MainActor
    func perform() async throws -> some IntentResult {
        .result()
    }
}
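For what it's worth, here is a sketch of how I read the appContext route, assuming CameraCaptureIntent.appContext and updateAppContext(_:) behave as documented. The isRawOn field on MyAppContext is added purely for illustration.
// Sketch only: isRawOn on MyAppContext is an assumption for illustration.
struct MyAppContext: Codable {
    var isRawOn: Bool = false
}

// Main app: persist the toggle to UserDefaults and mirror it into the intent's context.
@available(iOS 18, *)
func saveRawPreference(_ isRawOn: Bool) {
    UserDefaults.standard.set(isRawOn, forKey: "isRawOn")
    Task {
        try? await LaunchMyAppControlIntent.updateAppContext(MyAppContext(isRawOn: isRawOn))
    }
}

// Extension (or the main app on its next launch): read the mirrored value back.
@available(iOS 18, *)
func loadRawPreference() async -> Bool {
    if let context = try? await LaunchMyAppControlIntent.appContext {
        return context?.isRawOn ?? false
    }
    return false
}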
Prior to iOS 26, this successfully gave me a modal view with a transparent background:
let settingsVC = MySettingsViewController()
settingsVC.modalPresentationStyle = .automatic
//settingsVC.modalPresentationStyle = .overCurrentContext
self.present(settingsVC, animated: true, completion: {
})
MySettingsViewController:
self.view.backgroundColor = UIColor(white: 0, alpha: 0.5)
Now, in iOS 26, the modal view is presented on an opaque grey background.
let glassView = UIVisualEffectView(effect: UIGlassEffect(style: .clear))
glassView.frame = CGRect(x: 100, y: 200, width: 200, height: 400)
self.view.addSubview(glassView)
Though UIGlassEffect has two variants, .regular and .clear, even the clear one applies some blur to the background. Is there a way to get absolutely no blur, while the edges still have the glass effect?
Apple does this in two places:
Camera app:
Text magnifier:
My implementation of LockedCameraCapture does not launch my app when tapped from the Lock Screen. But when the same widget is in Control Center, it launches the app successfully.
Standard Xcode target template:
Lock_Screen_Capture.swift
@main
struct Lock_Screen_Capture: LockedCameraCaptureExtension {
    var body: some LockedCameraCaptureExtensionScene {
        LockedCameraCaptureUIScene { session in
            Lock_Screen_CaptureViewFinder(session: session)
        }
    }
}
Lock_Screen_CaptureViewFinder.swift:
import SwiftUI
import UIKit
import UniformTypeIdentifiers
import LockedCameraCapture
struct Lock_Screen_CaptureViewFinder: UIViewControllerRepresentable {
    let session: LockedCameraCaptureSession
    var sourceType: UIImagePickerController.SourceType = .camera

    init(session: LockedCameraCaptureSession) {
        self.session = session
    }

    func makeUIViewController(context: Self.Context) -> UIImagePickerController {
        let imagePicker = UIImagePickerController()
        imagePicker.sourceType = sourceType
        imagePicker.mediaTypes = [UTType.image.identifier, UTType.movie.identifier]
        imagePicker.cameraDevice = .rear
        return imagePicker
    }

    func updateUIViewController(_ uiViewController: UIImagePickerController, context: Self.Context) {
    }
}
Then I have my widget:
struct CameraWidgetControl: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(
            kind: "com.myCompany.myAppName.lock-screen") {
            ControlWidgetButton(action: MyAppCaptureIntent()) {
                Label("Capture", systemImage: "camera.shutter.button.fill")
            }
        }
    }
}
My AppIntent:
struct MyAppContext: Codable {}

struct MyAppCaptureIntent: CameraCaptureIntent {
    typealias AppContext = MyAppContext
    static let title: LocalizedStringResource = "MyAppCaptureIntent"
    static let description = IntentDescription("Capture photos and videos with MyApp.")

    @MainActor
    func perform() async throws -> some IntentResult {
        .result()
    }
}
The Issue
The LockedCameraCapture widget does not launch my app when tapped from the Lock Screen: you get the Face ID prompt and it just takes you to the Home Screen. But when the same widget is in Control Center, it launches the app successfully.
Error Message
When tapped on the Lock Screen, I get the following errors:
LaunchServices: store <private> or url <private> was nil: Error Domain=NSOSStatusErrorDomain Code=-54 "process may not map database"
UserInfo={NSDebugDescription=process may not map database, _LSLine=72, _LSFunction=_LSServer_GetServerStoreForConnectionWithCompletionHandler}
Attempt to map database failed: permission was denied. This attempt will not be retried.
Failed to initialize client context with error Error Domain=NSOSStatusErrorDomain Code=-54 "process may not map database"
UserInfo={NSDebugDescription=process may not map database, _LSLine=72, _LSFunction=_LSServer_GetServerStoreForConnectionWithCompletionHandler}
Things I tried
The widget image displays correctly.
The App ID and the provisioning profile seem to be fine, since the same code works when injected into the AVCam sample app using the same App IDs.
The AppIntent file has target membership in both the Lock Screen capture extension and the widget.
Everything compiles without errors or warnings.
I am following the Apple sample code and trying to add a manual focus lens position slider:
@available(iOS 18.0, *)
private func addCameraControls() {
    if !self.session.controls.isEmpty {
        for control in self.session.controls {
            self.session.removeControl(control)
        }
    }
    self.cameraControlFocusSlider = nil

    // Focus slider
    if self.videoDevice!.isLockingFocusWithCustomLensPositionSupported {
        self.cameraControlFocusSlider = AVCaptureSlider("Focus", symbolName: "dot.square", in: 0.0...1.0)
        self.cameraControlFocusSlider!.setActionQueue(self.sessionQueue) { focusValue in
            // Do manual focus
        }
        if self.session.canAddControl(self.cameraControlFocusSlider!) {
            self.session.addControl(self.cameraControlFocusSlider!)
        }
    }
}
So there are these AVCaptureSessionControlsDelegate methods:
final func sessionControlsDidBecomeActive(_ session: AVCaptureSession) {
    print("sessionControlsDidBecomeActive")
}

final func sessionControlsWillEnterFullscreenAppearance(_ session: AVCaptureSession) {
    print("sessionControlsWillEnterFullscreenAppearance")
}

final func sessionControlsWillExitFullscreenAppearance(_ session: AVCaptureSession) {
    print("sessionControlsWillExitFullscreenAppearance")
}

final func sessionControlsDidBecomeInactive(_ session: AVCaptureSession) {
    print("sessionControlsDidBecomeInactive")
}
So when self.cameraControlFocusSlider is presented, I have to show the current value of the lens position. The lens position can change from autofocus and also from manual focus by the user through the app UI. Is there a way to see if self.cameraControlFocusSlider is active or being used?
Please note that I will have more than one AVCaptureSlider in the final code.
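The direction I would sketch, as an assumption rather than confirmed behavior: track whether the Camera Control overlay is in use from the sessionControlsDidBecomeActive/Inactive callbacks above, and key-value observe lensPosition so the slider reflects both autofocus and in-app manual focus changes. The cameraControlsActive flag and the observation property are my additions.
// Sketch: extends the delegate methods shown above; names are my additions.
private var cameraControlsActive = false
private var lensPositionObservation: NSKeyValueObservation?

final func sessionControlsDidBecomeActive(_ session: AVCaptureSession) {
    cameraControlsActive = true
    // Mirror the device's lens position into the slider while the controls are shown.
    lensPositionObservation = videoDevice?.observe(\.lensPosition, options: [.new]) { [weak self] _, change in
        guard let self, let position = change.newValue else { return }
        self.cameraControlFocusSlider?.value = position
    }
}

final func sessionControlsDidBecomeInactive(_ session: AVCaptureSession) {
    cameraControlsActive = false
    lensPositionObservation = nil
}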
In the WWDC24 session video "Enhance your UI animations and transitions", Apple shows these new animation methods for UIKit:
switch gesture.state {
case .changed:
    UIView.animate(.interactiveSpring) {
        bead.center = gesture.translation
    }
case .ended:
    UIView.animate(spring) {
        bead.center = endOfBracelet
    }
}
As of iOS 18 Beta 2, I get an error for `UIView.animate(.interactiveSpring)`.
Are these new methods not available yet?
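For reference, a sketch of how I understand the new overload, assuming it shipped as shown in the session: it takes a SwiftUI.Animation value, so SwiftUI has to be imported alongside UIKit, and the symbol is iOS 18-only. Worth checking against the current beta headers.
// Sketch: assumes the new UIView.animate overload takes a SwiftUI.Animation,
// as shown in the WWDC24 session; verify against the current SDK.
import SwiftUI
import UIKit

func moveBead(_ bead: UIView, to point: CGPoint) {
    if #available(iOS 18.0, *) {
        UIView.animate(.interactiveSpring) {
            bead.center = point
        }
    } else {
        // Pre-iOS 18 fallback using the long-standing API.
        UIView.animate(withDuration: 0.3) {
            bead.center = point
        }
    }
}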
I already have an iOS 17 App Intent that works with a URL:
@available(iOS 16, *)
struct MyAppIntent: AppIntent {
    static let title: LocalizedStringResource = "My App Intent"
    static let openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some IntentResult {
        await UIApplication.shared.open(URL(string: "myapp://myappintent")!)
        return .result()
    }
}
Now, with iOS 18 and Control Widgets, I want to create a Control Widget button that simply opens the app with the same URL. However, UIApplication code is not allowed within extensions. For this, Apple says to use OpenIntent, which is shown here:
Link
Apple Sample Code from the link:
import AppIntents

struct LaunchAppIntent: OpenIntent {
    static var title: LocalizedStringResource = "Launch App"

    @Parameter(title: "Target")
    var target: LaunchAppEnum
}

enum LaunchAppEnum: String, AppEnum {
    case timer
    case history

    static var typeDisplayRepresentation = TypeDisplayRepresentation("Productivity Timer's app screens")
    static var caseDisplayRepresentations = [
        LaunchAppEnum.timer: DisplayRepresentation("Timer"),
        LaunchAppEnum.history: DisplayRepresentation("History")
    ]
}
The WWDC session video about this does not cover this particular method in detail, and the sample code is a bit confusing.
So how can I alter this code to just open the app with a URL?
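One possible direction, sketched on the assumption that the iOS 18 OpenURLIntent type in AppIntents is available (verify against the current SDK): return it from perform() instead of calling UIApplication.
// Sketch: OpenURLIntent availability is an assumption to verify; the URL string
// comes from the original intent above.
import AppIntents

@available(iOS 18.0, *)
struct LaunchMyAppIntent: AppIntent {
    static let title: LocalizedStringResource = "Launch My App"
    static let openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some IntentResult & OpensIntent {
        // Hands the URL off to the system instead of calling UIApplication,
        // which isn't available in a widget extension.
        .result(opensIntent: OpenURLIntent(URL(string: "myapp://myappintent")!))
    }
}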
With iOS 18, when you tint the Home Screen, widgets also need to pick up the tint. How do you detect this within SwiftUI?
I didn't see a WWDC24 session addressing this.
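A minimal sketch of the mechanism I believe WidgetKit exposes for this, the widgetRenderingMode environment value, which reports .accented when the Home Screen tint is applied in iOS 18:
// Sketch: reads the rendering mode WidgetKit injects into the environment.
import SwiftUI
import WidgetKit

struct MyWidgetView: View {
    @Environment(\.widgetRenderingMode) private var renderingMode

    var body: some View {
        if renderingMode == .accented {
            Text("Tinted")        // Home Screen tint is active.
        } else {
            Text("Full color")    // Normal rendering.
        }
    }
}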
After watching the session video "Build a great Lock Screen camera capture experience", I am still unclear about the UI.
So do developers need to provide a whole new UI in the extension? Can the main UI not be repurposed?
The WWDC23 Platforms State of the Union mentioned that using the volume buttons to trigger the camera shutter is coming later this year. This was mentioned at 0:30:15.
Would anyone know when this will be available?
With AVFoundation, how do you set up the new 24MP capture on the new iPhone 15 models?
I strongly believed it would be in the videoDevice.activeFormat.supportedMaxPhotoDimensions array, but it is not.
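For context, a sketch of the general max-photo-dimensions plumbing, not a confirmed answer on where a 24MP option shows up: it enumerates what the active format reports and applies a chosen dimension to the photo output and the per-capture settings.
// Sketch: enumerate supportedMaxPhotoDimensions and apply a chosen dimension.
import AVFoundation

func configureMaxPhotoDimensions(device: AVCaptureDevice,
                                 photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    let supported = device.activeFormat.supportedMaxPhotoDimensions
    // Log what the format actually offers, e.g. 4032x3024, 8064x6048, ...
    supported.forEach { print("Supported: \($0.width)x\($0.height)") }

    // Picks the largest entry here; substitute the 24MP entry if it appears.
    if let largest = supported.max(by: { $0.width * $0.height < $1.width * $1.height }) {
        photoOutput.maxPhotoDimensions = largest
    }

    let settings = AVCapturePhotoSettings()
    settings.maxPhotoDimensions = photoOutput.maxPhotoDimensions
    return settings
}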
I am trying to display HDR images (ProRAW) within a UIImageView using preferredImageDynamicRange. This was shown in a 2023 WWDC video.
let imageView = UIImageView()
if #available(iOS 17.0, *) {
    self.imageView.preferredImageDynamicRange = UIImage.DynamicRange.high
}
self.imageView.clipsToBounds = true
self.imageView.isMultipleTouchEnabled = true
self.imageView.contentMode = .scaleAspectFit
self.photoScrollView.addSubview(self.imageView)
I pull the image from PHImageManager:
let options = PHImageRequestOptions()
options.deliveryMode = .highQualityFormat
options.isNetworkAccessAllowed = true

PHImageManager.default().requestImage(for: asset, targetSize: self.targetSize(), contentMode: .aspectFit, options: options, resultHandler: { image, info in
    guard let image = image else {
        return
    }
    DispatchQueue.main.async {
        self.imageView.image = image
        if #available(iOS 17.0, *) {
            self.imageView.preferredImageDynamicRange = UIImage.DynamicRange.high
        }
    }
})
Issue
The image shows successfully, yet not in HDR mode (no bright specular highlights), unlike when the same ProRAW image is pulled up in the native camera app.
What am I missing here?
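One alternative decode path worth trying, sketched on the assumption that the iOS 17 UIImageReader API behaves as described in the WWDC23 HDR session, and with no guarantee it handles ProRAW data the way you need: request the original data and decode it with HDR explicitly enabled, since the UIImage handed back by requestImage may already be tone-mapped to SDR.
// Sketch: reuses `asset`, `options`, and `self.imageView` from the snippets above.
PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, _, _, _ in
    guard let data else { return }
    if #available(iOS 17.0, *) {
        var config = UIImageReader.Configuration()
        config.prefersHighDynamicRange = true       // decode with HDR enabled
        let reader = UIImageReader(configuration: config)
        let hdrImage = reader.image(data: data)
        DispatchQueue.main.async {
            self.imageView.preferredImageDynamicRange = .high
            self.imageView.image = hdrImage
        }
    }
}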
I have a child view controller added, and viewSafeAreaInsetsDidChange() gets called on it every time a frame change happens. How do I avoid this?
So far I am using these:
self.viewRespectsSystemMinimumLayoutMargins = false
self.view.insetsLayoutMarginsFromSafeArea = false
self.view.preservesSuperviewLayoutMargins = false
However, viewSafeAreaInsetsDidChange() is still being called. Is there a way to stop that?
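A workaround sketch rather than a way to suppress the callback, since the system calls it whenever it recalculates insets: compare against the last-seen insets and only react to genuine changes.
// Sketch: filter redundant calls instead of trying to prevent them.
class ChildViewController: UIViewController {
    private var lastSafeAreaInsets: UIEdgeInsets = .zero

    override func viewSafeAreaInsetsDidChange() {
        super.viewSafeAreaInsetsDidChange()
        guard view.safeAreaInsets != lastSafeAreaInsets else { return }
        lastSafeAreaInsets = view.safeAreaInsets
        // React to the genuine inset change here.
    }
}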
New variants of SF Pro were introduced for iOS 16, but I do not see an update to the UIFont API to use them.
https://developer.apple.com/documentation/uikit/uifont
Another thread on here said it could be done through UIFontDescriptor, but that doesn't seem like the official way.
I really wish there were more information on this. I want to use Expanded in my iOS update.
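For reference, a sketch of the width-variant API that appears to be in the iOS 16 SDK, UIFont.systemFont(ofSize:weight:width:) with UIFont.Width; worth verifying against the current headers before relying on it.
// Sketch: assumes the iOS 16 width-variant system font API is present.
import UIKit

func expandedTitleFont() -> UIFont {
    if #available(iOS 16.0, *) {
        return UIFont.systemFont(ofSize: 28, weight: .bold, width: .expanded)
    } else {
        // Fallback without a width variant on earlier releases.
        return UIFont.systemFont(ofSize: 28, weight: .bold)
    }
}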