Photos & Camera


Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.

Posts under the Photos & Camera subtopic

Post · Replies · Boosts · Views · Activity

Logged error/warning in FigCaptureSourceRemote when capturing a photo
I'm using this library to capture photos: https://github.com/Yummypets/YPImagePicker. I've modified it slightly, and I'm using an older version. When testing on my iPhone 16e (iOS 26), whenever I take a photo I get the following two error messages:

<<<< FigXPCUtilities >>>> signalled err=-17281 at <>:302
<<<< FigCaptureSourceRemote >>>> Fig assert: "err == 0 " at bail (FigCaptureSourceRemote.m:569) - (err=-17281)

These error messages appear, but as far as I can tell the photo comes through OK and I can save the data without any problem. I've even removed all of my handling code to see whether it was something I was doing. I don't really want to ship with these errors showing, but I also have no idea what could be causing them, and ChatGPT was not helpful in diagnosing this. Does anyone know what can cause this error? Is there a way I can see the source code to figure out whether I'm doing something wrong here? It really seems like an internal Apple error; otherwise I would have expected more detail relating the error to the code I've written. Any clues would be appreciated!
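To help narrow it down, here is a bare-bones capture sketch (hypothetical, independent of YPImagePicker and of my own handling code) that can be used to check whether the same Fig messages are logged even without the library in the picture:

```swift
import AVFoundation

// Minimal, hypothetical capture pipeline used only to isolate the log messages.
final class BareCapture: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()

    func start() throws {
        session.beginConfiguration()
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: .back) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }
        session.commitConfiguration()
        session.startRunning()
    }

    func capture() {
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    // Delegate callback: check whether the Fig messages still appear in the console here.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        print("capture finished, error: \(String(describing: error))")
    }
}
```

If the messages show up with this bare pipeline as well, they are coming from the system capture stack rather than from anything in the wrapper code.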
Replies: 2 · Boosts: 2 · Views: 575 · Activity: 2d
AVCaptureVideoPreviewLayer layoutSublayers invoked on background thread
Opening this question after discussing the issue in the AVCapture lab, in the hope that we can track it down here. We've been noticing some crashes in App Store Connect caused by layoutSublayers being called on a background thread. After debugging the issue a bit, we found that all calls that modified the AVCaptureSession or the preview layer were indeed made on the main thread. It would be useful to know what results in AVCaptureVideoPreviewLayer.updateFormatDescription being called. I've attached the crash log below. Crash log.ips - https://developer.apple.com/forums/content/attachment/800b0dba-3477-4c5a-b56c-f4cc393b384f
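For reference, this is the threading pattern we follow, modeled on Apple's AVCam-style samples (a simplified sketch, not the exact code from the crashing app): session configuration runs on a private serial queue, and the preview layer is only created, attached, and resized on the main thread.

```swift
import AVFoundation
import UIKit

final class PreviewController: UIViewController {
    private let session = AVCaptureSession()
    private let sessionQueue = DispatchQueue(label: "camera.session")  // serial queue for session work
    private let previewLayer = AVCaptureVideoPreviewLayer()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Preview layer is created, attached, and resized on the main thread only.
        previewLayer.session = session
        previewLayer.frame = view.bounds
        view.layer.addSublayer(previewLayer)

        // Session mutations happen off the main thread, on one serial queue.
        sessionQueue.async { [self] in
            session.beginConfiguration()
            if let camera = AVCaptureDevice.default(for: .video),
               let input = try? AVCaptureDeviceInput(device: camera),
               session.canAddInput(input) {
                session.addInput(input)
            }
            session.commitConfiguration()
            session.startRunning()
        }
    }

    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        previewLayer.frame = view.bounds  // main thread
    }
}
```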
Replies: 1 · Boosts: 1 · Views: 744 · Activity: Jun ’25
iPadOS 17 external camera exposure
I'm developing an iPad app that will be mostly dedicated to a certain external camera, for visually impaired people. The Linux UVC API (e.g. via guvcview) allows enabling automatic exposure for this camera. The iOS API isExposureModeSupported unfortunately returns false for every exposure mode. Is this a bug, or does AVFoundation simply not support UVC exposure control yet?
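This is roughly how I check it (a sketch, assuming iPadOS 17+ and the camera showing up as an external video device): enumerate external cameras and print which exposure modes each one claims to support.

```swift
import AVFoundation

// Enumerate external (UVC) cameras and report which exposure modes each one supports.
let discovery = AVCaptureDevice.DiscoverySession(deviceTypes: [.external],
                                                 mediaType: .video,
                                                 position: .unspecified)
for device in discovery.devices {
    let modes: [(String, AVCaptureDevice.ExposureMode)] = [
        ("locked", .locked),
        ("autoExpose", .autoExpose),
        ("continuousAutoExposure", .continuousAutoExposure),
        ("custom", .custom)
    ]
    for (name, mode) in modes {
        print("\(device.localizedName): \(name) supported = \(device.isExposureModeSupported(mode))")
    }
}
```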
Replies: 1 · Boosts: 2 · Views: 611 · Activity: Jul ’25
Generating Live Photo from JPG and MOV fails
I am working on an iOS application using SwiftUI where I want to convert a JPG and a MOV file into a Live Photo. I am using the LivePhoto class from GitHub for this. The JPG and MOV files are displayed correctly in my WallpaperDetailView, but I am facing issues when trying to generate the Live Photo and save it to the gallery. Here is the relevant code and the errors I am encountering.

Console prints:

Play button should be visible
Image URL fetched and set: Optional("https://firebasestorage.googleapis.com/...")
Video is ready to play
Video downloaded to: file:///var/mobile/Containers/Data/Application/.../tmp/CFNetworkDownload_7rW5ny.tmp
Failed to generate Live Photo

I have verified that the app has the necessary permissions to access the photo library. The JPEG and MOV files are successfully downloaded and can be displayed in the app. The issue seems to occur when generating the Live Photo from the downloaded files.

```swift
struct WallpaperDetailView: View {
    var wallpaper: Wallpaper
    @State private var isLoading = false
    @State private var isImageSaved = false
    @State private var imageURL: URL?
    @State private var livePhotoVideoURL: URL?
    @State private var player: AVPlayer?
    @State private var playerViewController: AVPlayerViewController?
    @State private var isVideoReady = false
    @State private var showBuffering = false

    var body: some View {
        ZStack {
            if let imageURL = imageURL {
                GeometryReader { geometry in
                    KFImage(imageURL)
                        .resizable()
                        ...
                }
            }
            if let playerViewController = playerViewController {
                VideoPlayerViewController(playerViewController: playerViewController)
                    .frame(maxWidth: .infinity, maxHeight: .infinity)
                    .clipped()
                    .edgesIgnoringSafeArea(.all)
            }
        }
        .onAppear {
            PHPhotoLibrary.requestAuthorization { status in
                if status == .authorized {
                    loadImage()
                } else {
                    print("User denied access to photo library")
                }
            }
        }
    }

    private func loadImage() {
        isLoading = true
        if let imageURLString = wallpaper.imageURL, let imageURL = URL(string: imageURLString) {
            self.imageURL = imageURL
            if imageURL.scheme == "file" {
                self.isLoading = false
                print("Local image URL set: \(imageURL)")
            } else {
                fetchDownloadURL(from: imageURLString) { url in
                    self.imageURL = url
                    self.isLoading = false
                    print("Image URL fetched and set: \(String(describing: url))")
                }
            }
        }
        if let livePhotoVideoURLString = wallpaper.livePhotoVideoURL, let livePhotoVideoURL = URL(string: livePhotoVideoURLString) {
            self.livePhotoVideoURL = livePhotoVideoURL
            preloadAndPlayVideo(from: livePhotoVideoURL)
        } else {
            self.isLoading = false
            print("No valid image or video URL")
        }
    }

    private func preloadAndPlayVideo(from url: URL) {
        self.player = AVPlayer(url: url)
        let playerViewController = AVPlayerViewController()
        playerViewController.player = self.player
        self.playerViewController = playerViewController
        let playerItem = AVPlayerItem(url: url)
        playerItem.preferredForwardBufferDuration = 1.0
        self.player?.replaceCurrentItem(with: playerItem)
        ...
        print("Live Photo Video URL set: \(url)")
    }

    private func saveWallpaperToPhotos() {
        if let imageURL = imageURL, let livePhotoVideoURL = livePhotoVideoURL {
            saveLivePhotoToPhotos(imageURL: imageURL, videoURL: livePhotoVideoURL)
        } else if let imageURL = imageURL {
            saveImageToPhotos(url: imageURL)
        }
    }

    private func saveImageToPhotos(url: URL) {
        ...
    }

    private func saveLivePhotoToPhotos(imageURL: URL, videoURL: URL) {
        isLoading = true
        downloadVideo(from: videoURL) { localVideoURL in
            guard let localVideoURL = localVideoURL else {
                print("Failed to download video for Live Photo")
                DispatchQueue.main.async { self.isLoading = false }
                return
            }
            print("Video downloaded to: \(localVideoURL)")
            self.generateAndSaveLivePhoto(imageURL: imageURL, videoURL: localVideoURL)
        }
    }

    private func generateAndSaveLivePhoto(imageURL: URL, videoURL: URL) {
        LivePhoto.generate(from: imageURL, videoURL: videoURL, progress: { percent in
            print("Progress: \(percent)")
        }, completion: { livePhoto, resources in
            guard let resources = resources else {
                print("Failed to generate Live Photo")
                DispatchQueue.main.async { self.isLoading = false }
                return
            }
            print("Live Photo generated with resources: \(resources)")
            self.saveLivePhotoToLibrary(resources: resources)
        })
    }

    private func saveLivePhotoToLibrary(resources: LivePhoto.LivePhotoResources) {
        LivePhoto.saveToLibrary(resources) { success in
            DispatchQueue.main.async {
                if success {
                    self.isImageSaved = true
                    print("Live Photo saved successfully")
                } else {
                    print("Failed to save Live Photo")
                }
                self.isLoading = false
            }
        }
    }

    private func fetchDownloadURL(from gsURL: String, completion: @escaping (URL?) -> Void) {
        let storageRef = Storage.storage().reference(forURL: gsURL)
        storageRef.downloadURL { url, error in
            if let error = error {
                print("Failed to fetch image URL: \(error)")
                completion(nil)
            } else {
                completion(url)
            }
        }
    }

    private func downloadVideo(from url: URL, completion: @escaping (URL?) -> Void) {
        let task = URLSession.shared.downloadTask(with: url) { localURL, response, error in
            guard let localURL = localURL, error == nil else {
                print("Failed to download video: \(String(describing: error))")
                completion(nil)
                return
            }
            completion(localURL)
        }
        task.resume()
    }
}
```
Replies: 1 · Boosts: 1 · Views: 799 · Activity: Mar ’25
PHLivePhotoEditingContext.saveLivePhoto results in AVFoundation error -11800 "The operation could not be completed" reason An unknown error occurred (-12815)
When trying to edit some Live Photos, calling PHLivePhotoEditingContext.saveLivePhoto results in the following error:

Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12815), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x300d05380 {Error Domain=NSOSStatusErrorDomain Code=-12815 "(null)"}}

I was able to replicate it on my device by taking a new Live Photo. Not sure what's wrong with that one specifically; not all Live Photos replicate the issue. I've submitted FB15880825 with a sysdiagnose and a Photos diagnostics as well. Any ideas what's going on here? It's impacting multiple customers. Thanks!
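For context, this is roughly the flow we use (a simplified sketch, identifiers such as the adjustment-data format are hypothetical); the -11800/-12815 error is reported from the saveLivePhoto call:

```swift
import Photos

// Request editing input, run a frame processor, save via PHLivePhotoEditingContext,
// then commit the output back to the asset.
func editLivePhoto(_ asset: PHAsset) {
    let options = PHContentEditingInputRequestOptions()
    asset.requestContentEditingInput(with: options) { input, _ in
        guard let input = input,
              let context = PHLivePhotoEditingContext(livePhotoEditingInput: input) else { return }

        // Identity processor for demonstration; the error surfaces from saveLivePhoto below.
        context.frameProcessor = { frame, _ in frame.image }

        let output = PHContentEditingOutput(contentEditingInput: input)
        output.adjustmentData = PHAdjustmentData(formatIdentifier: "com.example.livephoto.edit",
                                                 formatVersion: "1.0",
                                                 data: Data("identity".utf8))

        context.saveLivePhoto(to: output, options: nil) { success, error in
            guard success else {
                print("saveLivePhoto failed: \(String(describing: error))")
                return
            }
            PHPhotoLibrary.shared().performChanges {
                PHAssetChangeRequest(for: asset).contentEditingOutput = output
            } completionHandler: { committed, error in
                print("committed: \(committed), error: \(String(describing: error))")
            }
        }
    }
}
```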
Replies: 1 · Boosts: 0 · Views: 580 · Activity: Jun ’25
Can you add pictures with the camera using the new photos picker instead of the old UI View Controller?
I'm a new app developer and am trying to add a button that adds pictures from the photo library AND the camera. I added the first part (picking from the photo library) using the new-ish PhotosPicker, but I can't find a way to do the same thing for the camera. Should I just tough it out and use the UIKit view controller approach I've seen in all of the YouTube tutorials I've come across? I also want the user to be able to crop the picture in the app after they take it. Thanks in advance!
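As far as I can tell, PhotosPicker only reads from the library. One common approach (a sketch, not the only option) is to keep PhotosPicker for the library and wrap UIImagePickerController for the camera:

```swift
import SwiftUI
import UIKit

// SwiftUI wrapper around the UIKit camera picker. allowsEditing gives a basic square crop.
struct CameraPicker: UIViewControllerRepresentable {
    @Binding var image: UIImage?
    @Environment(\.dismiss) private var dismiss

    func makeUIViewController(context: Context) -> UIImagePickerController {
        let picker = UIImagePickerController()
        picker.sourceType = .camera          // requires NSCameraUsageDescription in Info.plist
        picker.allowsEditing = true          // simple built-in cropping after capture
        picker.delegate = context.coordinator
        return picker
    }

    func updateUIViewController(_ uiViewController: UIImagePickerController, context: Context) {}

    func makeCoordinator() -> Coordinator { Coordinator(self) }

    final class Coordinator: NSObject, UIImagePickerControllerDelegate, UINavigationControllerDelegate {
        let parent: CameraPicker
        init(_ parent: CameraPicker) { self.parent = parent }

        func imagePickerController(_ picker: UIImagePickerController,
                                   didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
            parent.image = (info[.editedImage] ?? info[.originalImage]) as? UIImage
            parent.dismiss()
        }

        func imagePickerControllerDidCancel(_ picker: UIImagePickerController) {
            parent.dismiss()
        }
    }
}
```

For more control over cropping you would still need a custom cropping view, but this covers the basic capture-and-crop case.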
Replies: 1 · Boosts: 0 · Views: 628 · Activity: Dec ’24
Photo album widget won’t work
For a while I had one photo widget (no special app, just the standard Apple one) set to shuffle through an album of pictures of my boyfriend, with no problems at all. A few weeks later I added another one to shuffle through an album of pictures of my cat. That one worked fine, but it made the first one stop working: it just showed a blank white widget, with no error message or anything. So I removed the cat widget hoping the other one would go back to working, and it didn’t. Otherwise I only have the widgets for Find My and my bank, plus apps, on my Home Screen.
Replies: 1 · Boosts: 0 · Views: 650 · Activity: Dec ’24
Compatibility Issue Between 360 Cameras and iPad Pro M4
Hello everyone, I need some help with this. If you know anything about it, please comment.

Overview: We are planning to develop an app using the “Support external cameras in your iPadOS app” feature introduced in iPadOS 17. Before implementing this feature, the iPad needs to recognize external cameras. However, among the iPad models compatible with iPadOS 17, we have found that some of the iPads owned by our development team can recognize external cameras while others cannot. If you have any reports of compatibility issues, or information on how to resolve them, please share them with us.

Detailed explanation: the results of our investigation are as follows.

External cameras used (360-degree cameras):
- RICOH Theta X, firmware 2.61.0 (2024/12/26, latest)
- RICOH Theta Z1

Tested iPad devices:
- 12.9-inch iPad Pro (3rd generation), iPadOS 17.5.1: OK
- 11-inch iPad Pro (M4), iPadOS 18.2: NG

Verification method:
Step 1: Power on the iPad and the external camera, ensuring both are ready for connection.
Step 2: Connect the iPad and the external camera using a USB-C cable.
Step 3: Launch FaceTime on the iPad and check the displayed camera feed. If the external camera is recognized, its feed will be displayed.
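Independently of the FaceTime check, a quick way to see whether AVFoundation itself enumerates the camera is a discovery session for the .external device type (a small sketch of our own; it assumes iPadOS 17+ and a video-capable UVC interface on the camera):

```swift
import AVFoundation

// List external cameras as iPadOS sees them.
let discovery = AVCaptureDevice.DiscoverySession(deviceTypes: [.external],
                                                 mediaType: .video,
                                                 position: .unspecified)
if discovery.devices.isEmpty {
    print("No external cameras recognized")
} else {
    for device in discovery.devices {
        print("Found external camera: \(device.localizedName), modelID: \(device.modelID)")
    }
}
```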
Replies: 1 · Boosts: 0 · Views: 510 · Activity: Jan ’25
Is there a way to filter PHPickerViewController by the creation date of the assets?
Our app filters the photo library to a certain date range for ease of picking photos. However, to do this we currently have to require full permission to the photo library. We would like to use PHPickerViewController and have it filter the results by the assets' creation date, which would let us drop that requirement. I see other filter options, but not this one. And if it isn't there, is this something that is being considered or on a roadmap?
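For reference, these are the kinds of filters PHPickerConfiguration currently exposes (a sketch); they select by asset type, not by metadata such as creation date:

```swift
import PhotosUI
import UIKit

// Type-based filters only: images, live photos, videos, etc. No date-range filter.
var config = PHPickerConfiguration(photoLibrary: .shared())
config.filter = .any(of: [.images, .livePhotos])
config.selectionLimit = 0                           // 0 = no limit
let picker = PHPickerViewController(configuration: config)
// picker.delegate = self   // a PHPickerViewControllerDelegate in the host, assumed to exist
```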
Replies: 1 · Boosts: 0 · Views: 563 · Activity: Jan ’25
App crashes when opening camera from file input in WKWebView (iOS)
Hello everyone, I have a SwiftUI app using WKWebView to load a website that includes a form with a file input. The issue:

📌 When a user taps “Browse” and selects “Take Photo” (the camera option), the app crashes before the camera opens.

Setup details:
• The app uses SwiftUI with WKWebView.
• The crash occurs only when selecting “Take Photo”; selecting an image from the library works fine.

📌 Full code (WKWebView in SwiftUI):

```swift
import SwiftUI
import WebKit

struct WebViewRepresentable: UIViewRepresentable {
    var urlString: String

    func makeUIView(context: Context) -> WKWebView {
        let webView = WKWebView()
        webView.configuration.allowsInlineMediaPlayback = true
        webView.configuration.mediaTypesRequiringUserActionForPlayback = []
        loadURL(in: webView)
        return webView
    }

    func updateUIView(_ uiView: WKWebView, context: Context) {
        loadURL(in: uiView)
    }

    private func loadURL(in webView: WKWebView) {
        if let url = URL(string: urlString) {
            webView.load(URLRequest(url: url))
        }
    }
}

struct ContentView: View {
    @State private var currentURL: String = "https://fv-wohlensee.ch"

    var body: some View {
        VStack(spacing: 0) {
            // Upper area in green
            Color(red: 0, green: 0.4, blue: 0)
                .frame(height: 50)

            // WebView with white background
            WebViewRepresentable(urlString: currentURL)
                .background(Color.white)

            Divider()

            // Navigation buttons
            HStack(spacing: 10) {
                Button {
                    currentURL = "https://fv-wohlensee.ch/vereinshaus-eymatt/"
                } label: {
                    VStack {
                        Image(systemName: "house")
                            .font(.system(size: 18))
                        Text("Klubhaus")
                            .font(.system(size: 12))
                            .minimumScaleFactor(0.7)
                            .lineLimit(1)
                    }
                    .padding(8)
                }
                .foregroundColor(.white)
                .frame(maxWidth: .infinity)

                Button {
                    currentURL = "https://fv-wohlensee.ch/vereinsboot/"
                } label: {
                    VStack {
                        Image(systemName: "ferry.fill")
                            .font(.system(size: 18))
                        Text("Boot")
                            .font(.system(size: 12))
                            .minimumScaleFactor(0.7)
                            .lineLimit(1)
                    }
                    .padding(8)
                }
                .foregroundColor(.white)
                .frame(maxWidth: .infinity)

                Button {
                    currentURL = "https://fv-wohlensee.ch/aktivitaeten/"
                } label: {
                    VStack {
                        Image(systemName: "calendar")
                            .font(.system(size: 18))
                        Text("Aktivitäten")
                            .font(.system(size: 12))
                            .minimumScaleFactor(0.7)
                            .lineLimit(1)
                    }
                    .padding(8)
                }
                .foregroundColor(.white)
                .frame(maxWidth: .infinity)

                Button {
                    currentURL = "https://fv-wohlensee.ch/mitglied-werden/"
                } label: {
                    VStack {
                        Image(systemName: "person.badge.plus")
                            .font(.system(size: 18))
                        Text("Mitglied")
                            .font(.system(size: 12))
                            .minimumScaleFactor(0.7)
                            .lineLimit(1)
                    }
                    .padding(8)
                }
                .foregroundColor(.white)
                .frame(maxWidth: .infinity)
            }
            .padding(.horizontal, 15)
            .padding(.vertical, 10)
            .background(Color(red: 0, green: 0.4, blue: 0))
        }
        .frame(maxWidth: .infinity, maxHeight: .infinity)
        .background(Color(red: 0, green: 0.4, blue: 0))
        .ignoresSafeArea()
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
```

What I’ve tried:

1️⃣ Checked Info.plist and added permissions for the camera and photo library:

```xml
<key>NSCameraUsageDescription</key>
<string>This app requires access to the camera to upload photos.</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>This app requires access to your photo library.</string>
```

2️⃣ Enabled media capture in WKWebView:

```swift
webView.configuration.allowsInlineMediaPlayback = true
webView.configuration.mediaTypesRequiringUserActionForPlayback = []
```

3️⃣ Tested in Safari: the same form works fine when opened in Safari.

Questions:
❓ Does WKWebView need additional permissions to open the camera?
❓ Do I need to implement a delegate to handle file uploads in SwiftUI?
❓ Has anyone faced this issue and found a fix? Any guidance would be greatly appreciated! 🚀 Thanks in advance! 😊
Replies: 1 · Boosts: 1 · Views: 464 · Activity: Jan ’25
PIP Camera in iOS App
I am developing an iOS app with video call functionality and implementing Picture in Picture (PiP) mode for video calls. The issue I am facing is that the camera stops capturing video when the app goes to the background, even though the PiP view is still visible. I have noticed that some apps, like Telegram, manage to keep the camera working in PiP mode while the app is in the background. How can I achieve this in my app?
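One thing worth checking (a sketch under the assumption that the device, iOS version, and the app's entitlements/use case allow it; not necessarily what Telegram does): AVCaptureSession can keep the camera running while the app is not the foreground, active app if multitasking camera access is supported and enabled.

```swift
import AVFoundation

// Opt the capture session into running while the app is multitasking / backgrounded.
// Availability depends on the device, iOS 16+, and the app's entitlements, so treat this
// as something to verify rather than a guaranteed fix.
func configureBackgroundCapture(for session: AVCaptureSession) {
    if session.isMultitaskingCameraAccessSupported {
        session.isMultitaskingCameraAccessEnabled = true
    } else {
        print("Multitasking camera access not supported on this device/configuration")
    }
}
```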
Replies: 1 · Boosts: 0 · Views: 586 · Activity: Jan ’25
LockedCameraCapture with ARKit based App
Hello, I'm building a camera app around ARKit. I've created a LockedCameraCapture extension and added a control to launch my camera app, but when I launch the extension I just see a black screen with no hint of any error, and attaching the debugger to the running process shows no logs. I'm wondering: is LockedCameraCapture supported with ARView and ARSession? ARKit was featured in a WWDC video with a camera app use case, and the introduction of captureHighResolutionFrame(completion:) made me pick it up as an interesting camera app backbone, but if Lock Screen capture is not possible with it, I will have to refactor my codebase.
Replies: 1 · Boosts: 0 · Views: 507 · Activity: Jan ’25
Unable to Capture 24MP Photos
Hello, I'm wondering how to capture 24MP photos. I'm currently testing on an iPhone 16 Pro Max. By default, the device's activeFormat supports 24MP (photo dimensions: {4032x3024, 5712x4284}). For the photoOutput, I'm setting maxPhotoDimensions to videoDevice.activeFormat.supportedMaxPhotoDimensions.lastObject, and setting maxPhotoQualityPrioritization to quality. When capturing, I'm applying the same maxPhotoDimensions and photoQualityPrioritization settings from the photoOutput directly to the AVCapturePhotoSettings. What could be the issue?

```objc
// Objective-C
// setup
[self.photoOutput setMaxPhotoQualityPrioritization:AVCapturePhotoQualityPrioritizationQuality];
CMVideoDimensions maxPhotoDimensions = [(NSValue *)videoDevice.activeFormat.supportedMaxPhotoDimensions.lastObject CMVideoDimensionsValue];
[self.photoOutput setMaxPhotoDimensions:maxPhotoDimensions];

// capturing
AVCapturePhotoSettings *photoSettings = [AVCapturePhotoSettings photoSettings];
photoSettings.maxPhotoDimensions = self.photoOutput.maxPhotoDimensions;
photoSettings.photoQualityPrioritization = self.photoOutput.maxPhotoQualityPrioritization;
[self.photoOutput capturePhotoWithSettings:photoSettings delegate:photoCaptureDelegate];
...
```
Replies: 1 · Boosts: 0 · Views: 578 · Activity: Mar ’25
photos-navigation://album scheme
In my app SexyPeri (https://apps.apple.com/fr/app/id6738291985), I create an album with some pictures. The album is called SexyPeri. Now I wish to send the user from my app directly to that album in Photos. There is no documentation for your photos-navigation scheme, or I didn't find it. Some people have reverse-engineered it, but I couldn't make it work: photos-navigation://album?name=SexyPeri doesn't do anything. So my question is: how can I redirect the user to the album directly?
Replies: 1 · Boosts: 0 · Views: 384 · Activity: Jan ’25
Improving the transition between ultra wide and wide angle lens
I'm building an app which uses the camera and want to take advantage of the ability of the builtInTripleCamera and builtInDualWideCamera to automatically switch between the ultra wide and wide angle lens to focus on close up shots. It's working fine - except that the transition between the two lenses is a bit jumpy. I looked at what the native Camera app does and it seems to apply a small amount of blurring when the transition happens to help "mask" the jumpiness. How can I replicate this, or is there another way to improve the UX of switching between one lens and another automatically?
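One building block for this (a sketch of my own; how the system Camera app actually masks the switch is not documented) is to briefly fade a blur view over the preview around the moment a hand-over happens, which hides the jump. AVCaptureDevice's virtualDeviceSwitchOverVideoZoomFactors can help decide where those hand-overs occur; the trigger itself is left as a hypothetical call site here.

```swift
import UIKit

// Briefly fade a blur in over the preview container, then fade it back out.
// Call this from whatever signal you use to detect the constituent-camera switch.
func maskLensSwitch(over previewContainer: UIView) {
    let blur = UIVisualEffectView(effect: UIBlurEffect(style: .regular))
    blur.frame = previewContainer.bounds
    blur.alpha = 0
    previewContainer.addSubview(blur)

    UIView.animate(withDuration: 0.15, animations: { blur.alpha = 1 }) { _ in
        UIView.animate(withDuration: 0.25, delay: 0.1, options: [], animations: { blur.alpha = 0 }) { _ in
            blur.removeFromSuperview()
        }
    }
}
```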
Replies: 1 · Boosts: 0 · Views: 474 · Activity: Feb ’25
Attaching depth map manually
Hi, I have a problem when I want to attach my grayscale depth map to the real (RGB) image. The produced depth map doesn't have the cameraCalibration value, which should be responsible for aligning the depth data with the image. How do I align the depth map? I saw an article about it, but it is not very detailed, so I might be missing some step. I tried to:
- convert my depth map into a pixel buffer
- create an image destination ref and add the image there
- add the auxData (depth map dict)

This is the output: there is some black space, and the colours of my RGB image change.
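For comparison, this is the writing path I'm aiming at (a sketch; `rgbImage` and the AVDepthData built from my own dictionary are hypothetical inputs, and no calibration is attached here, so the depth is simply scaled to the image by viewers):

```swift
import AVFoundation
import ImageIO
import UniformTypeIdentifiers

// Write an RGB CGImage plus depth auxiliary data into a single HEIC file.
func writeImage(rgbImage: CGImage, depthData: AVDepthData, to url: URL) {
    guard let destination = CGImageDestinationCreateWithURL(url as CFURL,
                                                            UTType.heic.identifier as CFString,
                                                            1, nil) else { return }
    CGImageDestinationAddImage(destination, rgbImage, nil)

    // Ask AVDepthData for the ImageIO auxiliary-data dictionary and its type
    // (kCGImageAuxiliaryDataTypeDepth or ...Disparity, depending on the pixel format).
    var auxType: NSString?
    if let auxInfo = depthData.dictionaryRepresentation(forAuxiliaryDataType: &auxType),
       let auxType = auxType {
        CGImageDestinationAddAuxiliaryDataInfo(destination, auxType as CFString, auxInfo as CFDictionary)
    }
    if !CGImageDestinationFinalize(destination) {
        print("Failed to finalize image destination")
    }
}
```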
Replies: 1 · Boosts: 0 · Views: 395 · Activity: Mar ’25