I'm a new app developer and am trying to add a button that adds pictures from the photo library AND camera. I added the first function (adding pictures from the photo library) using the newish PhotosPicker, but I can't find a way to do the same thing for the camera. Should I just tough it out and use the UIViewControllerRepresentable approach that I've seen in all of the YouTube tutorials I've come across?
I also want the user to be able to crop the picture in the app after they take a picture.
Thanks in advance
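A minimal sketch of that usual approach, assuming a SwiftUI app: wrap UIImagePickerController (with sourceType .camera and allowsEditing for the system's basic crop step) in a UIViewControllerRepresentable. The CameraPicker name and the image binding are illustrative, not an official API.

```swift
import SwiftUI
import UIKit

// Hypothetical wrapper; PhotosPicker has no camera source, so UIImagePickerController is still the common route.
struct CameraPicker: UIViewControllerRepresentable {
    @Binding var image: UIImage?
    @Environment(\.dismiss) private var dismiss

    func makeUIViewController(context: Context) -> UIImagePickerController {
        let picker = UIImagePickerController()
        picker.sourceType = .camera          // capture from the camera instead of the library
        picker.allowsEditing = true          // shows the built-in square crop UI after capture
        picker.delegate = context.coordinator
        return picker
    }

    func updateUIViewController(_ uiViewController: UIImagePickerController, context: Context) {}

    func makeCoordinator() -> Coordinator { Coordinator(self) }

    final class Coordinator: NSObject, UIImagePickerControllerDelegate, UINavigationControllerDelegate {
        let parent: CameraPicker
        init(_ parent: CameraPicker) { self.parent = parent }

        func imagePickerController(_ picker: UIImagePickerController,
                                   didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
            // Prefer the edited (cropped) image when the user used the crop UI.
            parent.image = (info[.editedImage] ?? info[.originalImage]) as? UIImage
            parent.dismiss()
        }

        func imagePickerControllerDidCancel(_ picker: UIImagePickerController) {
            parent.dismiss()
        }
    }
}
```

Presented from SwiftUI (for example with .fullScreenCover), this covers capture plus the system crop step; for richer in-app cropping you would add your own editor afterwards.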
Photos & Camera
Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.
For a while I had one photo widget (no special app, just the standard Apple one) set to shuffle through an album of pics of my bf, with no problems at all. A few weeks later I added a second one to shuffle through an album of pics of my cat. That one worked fine, but it made the one of my bf stop working: it just showed a blank white widget, no error message or anything. So I removed the cat one hoping the one of my bf would go back to working, and it didn't. Otherwise I only have the widgets for Find My and my bank, and then apps on my home screen.
Hey, I have a complex CIFilter chain I'm trying to debug to improve processing time. Is there any documentation on what all the colours mean and the naming, e.g. sRGB Linear_to_workspace?
Thanks
Alex
Is it possible to support external cameras in an iPadOS app and use Swift to read multiple camera feeds?
Thanks
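A minimal discovery sketch, assuming iPadOS 17+ where AVCaptureDevice.DeviceType.external is available; reading more than one feed at a time would additionally depend on AVCaptureMultiCamSession support on the device and on whether external cameras can participate in it.

```swift
import AVFoundation

// Find the built-in back camera plus any external (e.g. USB-C/UVC) cameras on iPadOS 17+.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera, .external],
    mediaType: .video,
    position: .unspecified
)

for device in discovery.devices {
    print(device.localizedName, device.deviceType, device.isConnected)
}

// Each device can back its own AVCaptureDeviceInput; running several inputs
// simultaneously requires AVCaptureMultiCamSession where it is supported.
if AVCaptureMultiCamSession.isMultiCamSupported {
    let session = AVCaptureMultiCamSession()
    // add inputs/outputs per device here
}
```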
Hey everyone 😊, I am building an app that includes a live camera feed preview. That's all I need to do, alongside identifying the images with a Create ML image classification model. I don't need to capture images at all. I've seen some very complicated tutorials; I just want to use a couple of lines of code.
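A minimal sketch of the preview part, assuming UIKit and that camera permission is already granted; the Create ML model would be run separately (typically via Vision on frames from an AVCaptureVideoDataOutput, not shown here).

```swift
import UIKit
import AVFoundation

final class PreviewViewController: UIViewController {
    private let session = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Wire the default back camera into the session.
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // A preview layer is all that's needed for a live feed; no photo output required.
        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.frame = view.bounds
        previewLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(previewLayer)

        // startRunning blocks, so keep it off the main thread.
        DispatchQueue.global(qos: .userInitiated).async { [session] in
            session.startRunning()
        }
    }
}
```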
Is using value(forKey: "fileSize") considered accessing non-public API?
Has your app been rejected at the review stage because of this?
let resources = PHAssetResource.assetResources(for: asset)
if let resource = resources.first {
    // "fileSize" is not a documented PHAssetResource property; it is read here via KVC.
    if let fileSize = resource.value(forKey: "fileSize") as? Int {
        return fileSize
    }
}
Hello everyone, I need some help with this.
If you know anything about it, please comment.
Overview
We are planning to develop an app using the “Support external cameras in your iPadOS app” feature introduced in iPadOS 17.
Before implementing this feature, it is necessary for the iPad to recognize external cameras. However, among the iPad models compatible with iPadOS 17, we have found that some of the iPads owned by our development team can recognize external cameras, while others cannot.
If you have any reports regarding compatibility issues or information on how to resolve these problems, please share them with us.
Detailed Explanation:
The results of our investigation are as follows:
External Camera Used: A 360-degree camera
Device and firmware:
RICOH Theta X, firmware 2.61.0 (2024/12/26, latest)
RICOH Theta Z1

Tested iPads (device, firmware, status):
12.9-inch iPad Pro (3rd generation), iOS 17.5.1: OK (external camera recognized)
11-inch iPad Pro (M4), iOS 18.2: NG (not recognized)
Verification Method
Step 1: Power on the iPad and the external camera, ensuring both are ready for connection.
Step 2: Connect the iPad and the external camera using a USB-C cable.
Step 3: Launch FaceTime on the iPad and check the displayed camera feed.
If the external camera is recognized, the feed from the external camera will be displayed.
Our app filters the photo library to a certain date range to make picking photos easier. However, to do this we currently have to require full access to the photo library. We would like to use PHPickerViewController and have it filter the results by an asset's creation date, which would let us drop that requirement.
I see other filter options, but not this one. If it isn't there, is it something being considered or on a roadmap?
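For reference, a minimal sketch of the filters PHPickerConfiguration currently exposes; as far as I can tell there is no date-based PHPickerFilter, so this only narrows by asset kind.

```swift
import PhotosUI

var config = PHPickerConfiguration(photoLibrary: .shared())
// Existing filters narrow by asset type, e.g. images only, or images-or-videos:
config.filter = .any(of: [.images, .videos])
config.selectionLimit = 0   // 0 means no limit

let picker = PHPickerViewController(configuration: config)
// present(picker, animated: true) from a view controller
```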
Hey, I'm building a camera app and I want to use the captured HDRGainMap alongside the photo to do some processing with a CIFilter chain. How can this be done? I can't find any documentation on this anywhere, only on how to access the HDRGainMap from an existing HEIC file, which I have done successfully. For that I'm doing something like the following:
// Copy the HDR gain map auxiliary data from an existing HEIC via Image I/O.
if let gainmap = CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, kCGImageAuxiliaryDataTypeHDRGainMap) {
    let gainDict = gainmap as NSDictionary
    let gainData = gainDict[kCGImageAuxiliaryDataInfoData] as? Data
    let gainDescription = gainDict[kCGImageAuxiliaryDataInfoDataDescription]
    let gainMeta = gainDict[kCGImageAuxiliaryDataInfoMetadata]
}
However, I'm not sure what the approach is with an AVCapturePhoto output from an AVCaptureDevice.
Thanks!
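One possible route, sketched here as an assumption rather than a documented recipe: an AVCapturePhoto can be turned back into Image I/O territory via fileDataRepresentation(), after which the same auxiliary-data call applies. The hdrGainMapInfo(from:) helper name is illustrative.

```swift
import AVFoundation
import ImageIO

// Intended to be called from photoOutput(_:didFinishProcessingPhoto:error:), assuming
// HEIF capture settings that actually embed a gain map.
func hdrGainMapInfo(from photo: AVCapturePhoto) -> [CFString: Any]? {
    guard let data = photo.fileDataRepresentation(),                       // full container, including aux images
          let source = CGImageSourceCreateWithData(data as CFData, nil),
          let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, kCGImageAuxiliaryDataTypeHDRGainMap) as? [CFString: Any]
    else { return nil }
    return info   // contains kCGImageAuxiliaryDataInfoData / ...DataDescription / ...Metadata
}
```

From there the data and description can feed the same CIFilter chain as in the HEIC-on-disk case.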
I am developing an iOS app with video call functionality and implementing Picture in Picture (PiP) mode for video calls. The issue I am facing is that the camera stops capturing video when the app goes to the background, even though the PiP view is still visible.
I have noticed that some apps, like Telegram, manage to keep the camera working in PiP mode while the app is in the background. How can I achieve this in my app?
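My understanding, offered as an assumption to verify rather than a confirmed recipe: background camera access in this scenario is gated by the com.apple.developer.avfoundation.multitasking-camera-access entitlement, and AVCaptureSession exposes properties to check for and opt into it (iOS 16+).

```swift
import AVFoundation

let session = AVCaptureSession()

// Requires the multitasking camera access entitlement, which Apple grants for
// voice/video-calling use cases.
if session.isMultitaskingCameraAccessSupported {
    // Opt in so capture keeps running when the app moves to the background with PiP showing.
    session.isMultitaskingCameraAccessEnabled = true
}
```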
Hello,
I'm building a camera app around ARKit. I've created a Lockscreen Capture Extension and added a control to initiate my camera app, but when I launch the extension I see just a black screen with no hints at any errors. Also attaching the debugger to the running process shows no logs.
I'm wondering: is LockedCameraCapture supported with ARView and ARSession?
ARKit was featured in a WWDC video with a camera-app use case, and the introduction of captureHighResolutionFrame(completion:) made me pick it up as an interesting camera app backbone, but if lock screen capture is not possible with it I'll have to refactor my codebase.
I have added this extension to my app and followed the steps in the documentation, but it didn't work. When I press the control from Control Center it only opens the app, and when I put it on the Lock Screen it doesn't do anything. I don't know what I'm missing.
The documentation for AVCaptureDeviceRotationCoordinator says it is designed to work with any CALayer, but it seems to work only with AVCaptureVideoPreviewLayer. Can someone confirm whether it is possible to make it work with other layers, such as CAMetalLayer or AVSampleBufferDisplayLayer?
I am using AVMulti so the user captures two images. How can I access those images if there is only one URL that stores the captured images for the LockedCameraCapture extension? Also, how can I detect whether the user opened the app from the extension, so I can navigate them to the right screen?
I'm building an app which uses the camera and want to take advantage of the ability of the builtInTripleCamera and builtInDualWideCamera to automatically switch between the ultra wide and wide angle lens to focus on close up shots.
It's working fine - except that the transition between the two lenses is a bit jumpy. I looked at what the native Camera app does and it seems to apply a small amount of blurring when the transition happens to help "mask" the jumpiness. How can I replicate this, or is there another way to improve the UX of switching between one lens and another automatically?
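A sketch of one way to detect the switch so you can time your own cross-fade or blur, assuming the activePrimaryConstituentDevice KVO property on the virtual device (iOS 15+); the exact masking effect the system Camera app applies isn't public, so the blur itself is up to you.

```swift
import AVFoundation

final class LensSwitchObserver {
    private var observation: NSKeyValueObservation?

    func start(on device: AVCaptureDevice, onSwitch: @escaping (AVCaptureDevice?) -> Void) {
        // Fires whenever the virtual device (e.g. builtInDualWideCamera) changes which
        // physical constituent camera it is actually using.
        observation = device.observe(\.activePrimaryConstituentDevice, options: [.new]) { _, change in
            DispatchQueue.main.async {
                // A good moment to run a short blur/cross-fade on the preview layer.
                onSwitch(change.newValue ?? nil)
            }
        }
    }
}
```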
I am trying to use AVCaptureDevice.rotationCoordinator API to observe angles for preview and capture and it seems there is an issue with the API when used with arbitrary CALayer (which is not a AVCaptureVideoPreviewLayer) and switching cameras.
Here is my setup. The below function is defined in an actor class called CameraManager that performs setup of rotationCoordinator.
func updateRotationCoordinator(_ callback: @escaping @MainActor (CGFloat) -> Void) {
    guard let device = sessionConfiguration.activeVideoInput?.device,
          let displayLayer = displayLayer else { return }

    cancellables.removeAll()

    rotationCoordinator = AVCaptureDevice.RotationCoordinator(device: device, previewLayer: displayLayer)
    guard let coordinator = rotationCoordinator else { return }

    coordinator.publisher(for: \.videoRotationAngleForHorizonLevelPreview)
        .receive(on: DispatchQueue.main)
        .sink { degrees in
            let radians = degrees * .pi / 180
            MainActor.assumeIsolated {
                callback(radians)
            }
        }
        .store(in: &cancellables)
}
This works the very first time, but when I switch cameras and call this function again, it throws a runtime error that the view's layer is modified from a non-main thread. This happens at the very line where the rotation coordinator is being recreated. It's not clear why initialising the rotation coordinator should modify CALayer properties right in its init method.
Modifying properties of a view's layer off the main thread is not allowed: view <MyApp.DisplayLayerView: 0x102ffaf40> with nearest ancestor view controller <_TtGC7SwiftUI19UIHostingControllerGVS_15ModifiedContentVS_7AnyViewVS_12RootModifier__: 0x101f7fb80>; backtrace:
(
0 UIKitCore 0x0000000194a977b4 575E5140-FA6A-37C2-B00B-A4EACEDFDA53 + 22509492
1 UIKitCore 0x000000019358594c 575E5140-FA6A-37C2-B00B-A4EACEDFDA53 + 416076
2 QuartzCore 0x00000001927f5bd8 D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 43992
3 QuartzCore 0x00000001927f5a4c D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 43596
4 QuartzCore 0x000000019283a41c D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 324636
5 QuartzCore 0x000000019283a0a8 D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 323752
6 AVFCapture 0x00000001af072a18 09192166-E0B6-346C-B1C2-7C95C3EFF7F7 + 420376
7 MyApp.debug.dylib 0x0000000105fa3914 $s10MyApp15CapturePipelineC25updateRotationCoordinatoryyy12CoreGraphics7CGFloatVScMYccF + 972
8 MyApp.debug.dylib 0x00000001063ade40 $s10MyApp11CameraModelC18switchVideoDevicesyyYaFTY3_ + 72
9 MyApp.debug.dylib 0x0000000105fe3cbd $s10MyApp11ContentViewV4bodyQrvg7SwiftUI6VStackVyAE05TupleE0VyAE6HStackVyAIyAE6SpacerV_AE6ButtonVyAE0E0PAEE5frame5width6height9alignmentQr12CoreGraphics7CGFloatVSg_AyE9AlignmentVtFQOyAqEE11scaledToFitQryFQOyAqEE10imageScaleyQrAE5ImageV0Z0OFQOyA3__Qo__Qo__Qo_GtGG_AmKyAIyAKyAIyAqEE7paddingyQrAE4EdgeO3SetV_AYtFQOyAA07CaptureM0V_Qo__AOyAE4TextVGAmKyAIyA9__AqEEArstUQrAY_AYA_tFQOyAM_Qo_A9_tGGtGG_AmqEE10background_AUQrqd___A_tAePRd__lFQOyAqEEArstUQrAY_AYA_tFQOyA21__Qo__AqEEArstUQrAY_AYA_tFQOyAE06_ShapeE0VyAE9RectangleVAE5ColorVG_Qo_Qo_SgtGGtGGyXEfU0_A42_yXEfU_A10_yXEfU_yyScMYccfU_yyYacfU_TQ1_ + 1
10 MyApp.debug.dylib 0x0000000105ff06d9 $s10MyApp11ContentViewV4bodyQrvg7SwiftUI6VStackVyAE05TupleE0VyAE6HStackVyAIyAE6SpacerV_AE6ButtonVyAE0E0PAEE5frame5width6height9alignmentQr12CoreGraphics7CGFloatVSg_AyE9AlignmentVtFQOyAqEE11scaledToFitQryFQOyAqEE10imageScaleyQrAE5ImageV0Z0OFQOyA3__Qo__Qo__Qo_GtGG_AmKyAIyAKyAIyAqEE7paddingyQrAE4EdgeO3SetV_AYtFQOyAA07CaptureM0V_Qo__AOyAE4TextVGAmKyAIyA9__AqEEArstUQrAY_AYA_tFQOyAM_Qo_A9_tGGtGG_AmqEE10background_AUQrqd___A_tAePRd__lFQOyAqEEArstUQrAY_AYA_tFQOyA21__Qo__AqEEArstUQrAY_AYA_tFQOyAE06_ShapeE0VyAE9RectangleVAE5ColorVG_Qo_Qo_SgtGGtGGyXEfU0_A42_yXEfU_A10_yXEfU_yyScMYccfU_yyYacfU_TATQ0_ + 1
11 MyApp.debug.dylib 0x0000000105f9c595 $sxIeAgHr_xs5Error_pIegHrzo_s8SendableRzs5NeverORs_r0_lTRTQ0_ + 1
12 MyApp.debug.dylib 0x0000000105f9fb3d $sxIeAgHr_xs5Error_pIegHrzo_s8SendableRzs5NeverORs_r0_lTRTATQ0_ + 1
13 libswift_Concurrency.dylib 0x000000019c49fe39 E15CC6EE-9354-3CE5-AF91-F641CA8283E0 + 433721
)
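A possible workaround, sketched under the assumption that RotationCoordinator's initialiser touches the supplied layer and therefore has to run on the main thread: hop to the main actor for the re-creation while keeping the rest of the actor's work where it is (sendability warnings around the layer are glossed over here).

```swift
// Hypothetical variant of updateRotationCoordinator: create the coordinator on the main actor,
// since its init appears to touch the preview layer it is handed.
func updateRotationCoordinator(_ callback: @escaping @MainActor (CGFloat) -> Void) async {
    guard let device = sessionConfiguration.activeVideoInput?.device,
          let displayLayer = displayLayer else { return }

    cancellables.removeAll()

    let coordinator = await MainActor.run {
        AVCaptureDevice.RotationCoordinator(device: device, previewLayer: displayLayer)
    }
    rotationCoordinator = coordinator

    coordinator.publisher(for: \.videoRotationAngleForHorizonLevelPreview)
        .receive(on: DispatchQueue.main)
        .sink { degrees in
            let radians = degrees * .pi / 180
            MainActor.assumeIsolated { callback(radians) }
        }
        .store(in: &cancellables)
}
```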
Several unknown pictures have been added to my iPhone Photos. I'm developing an iOS app in SwiftUI, and I'm sure that my app in development never asked for permission to save photos to the device, and that these pictures were not taken by me, downloaded from the web, or synced from my iCloud. Why does this happen? Has my account or my iPhone been attacked or hacked? By the way, my device has debug mode turned on; could this setting lead to some vulnerability, or could some debug tools add photos to my albums?
The attached pictures are the unknown photos in my Photos app. I've searched for this photo with Google and found that many developers on Stack Overflow use this picture when they run into trouble with image display or other manipulation. Where does this photo come from? Why does it exist in my photo albums?
I am trying to recreate the iOS Messages app photo selection UI, where a PHPickerViewController is displayed half screen, with the message text field and a scrolling photo viewer on top. I have that UI mostly working, but cannot figure out how to allow a user to remove an image from my scrolling photo viewer (just like in the iOS Messages app).
When the picker is initially displayed, I can show selected images using the preselectedAssetIdentifiers. However, if the user taps the "x" to remove an image from the scrolling photo viewer, there is no way that I have found to update that selection in the picker.
I can dismiss/show a new picker with the animated property set to false, but that creates a very apparent bounce in the screen. Are there any ways I am missing to accomplish this?
Here is what I have so far:
Hi,
I'm using Core Graphics to load a .DNG photo shot by a Leica Q3 camera.
The photo is shot in portrait, however the embedded preview is rotated 90 degrees to landscape.
I load the photo like this:
// Request HDR decoding when creating the full-resolution image.
let options = [kCGImageSourceDecodeRequest: kCGImageSourceDecodeToHDR] as CFDictionary
guard let source = CGImageSourceCreateWithData(data as CFData, nil) else { return }
let cgimage = CGImageSourceCreateImageAtIndex(source, 0, options)
let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any]
When doing this I can see that the orientation property is 1, indicating that the orientation is 'Up', which it isn't.
If I don't specify the kCGImageSourceDecodeToHDR option (essentially setting options to nil), the orientation property is 8 (rotated 90 degrees).
What puzzles me is that a change to the CGImageSourceCreateImageAtIndex call can have an influence on the later call to CGImageSourceCopyPropertiesAtIndex.
I would expect these to work independently?
Cheers
Thomas
I have a complex CoreImage pipeline which I'm keen to optimise. I'm aware that calling back to the CPU can have a significant impact on the performance - what is the best way to find out where this is happening?
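One starting point, offered as a suggestion rather than the canonical answer: run the app with the CI_PRINT_TREE environment variable set (usually via the scheme's Environment Variables pane, or programmatically as sketched below), which makes Core Image dump its filter and render graphs so you can see where intermediates are created and where a render falls back to the CPU.

```swift
import CoreImage
import Darwin   // for setenv

// Must be set before the first render; the scheme's Environment Variables pane is the
// more common place to do this (e.g. CI_PRINT_TREE = 1, or "graphviz" for a graph dump).
setenv("CI_PRINT_TREE", "1", 1)

let context = CIContext()
// ... run the existing pipeline as usual and inspect the tree output in the console.
```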