Here is a code snippet with which I attempted to set a neutral white balance:
func prepare(completionHandler: @escaping (Error?) -> Void) {
    func createCaptureSession() {
        self.captureSession = AVCaptureSession()
    }

    func configureCaptureDevices() throws {
        let session = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera], mediaType: AVMediaType.video, position: .back)
        let cameras = session.devices.compactMap { $0 }
        if cameras.isEmpty {
            throw CameraControllerError.noCamerasAvailable
        }
        do {
            for camera in cameras {
                try camera.lockForConfiguration()
                camera.setWhiteBalanceModeLocked(
                    with whiteBalanceGains: AVCaptureDevice.WhiteBalanceGains(redGain: 1.0, greenGain: 1.0, blueGain: 1.0),
                    completionHandler handler: ((CMTime) -> Void)? = nil
                ) // Errors: Extra arguments at positions #2, #3 in call; Cannot find 'with' in scope
                camera.whiteBalanceMode = .locked
                camera.unlockForConfiguration()
            }
        } catch {
            // To do: error handling here
        }
    }
My call to "AVCaptureDevice.WhiteBalanceGains()" gets errors. What is the correct way to do this?
The goal is to have the background and foreground colors show the current state of a UIButton on a storyboard. The attempt to do that is made with this code:
var toggleLight: Bool = false

@IBAction func toggleLight(_ sender: UIButton) {
    if toggleLight {
        toggleLight = false
        sender.baseBackgroundColor = UIColor.black
        sender.baseForegroundColor = UIColor.white
    } else {
        toggleLight = true
        sender.baseBackgroundColor = UIColor.white
        sender.baseForegroundColor = UIColor.black
    }
    tableView.reloadData()
}
I get these errors:
Value of type 'UIButton' has no member 'baseBackgroundColor'
Value of type 'UIButton' has no member 'baseForegroundColor'
I do not understand this because in file "UIKit.UIButton" I see:
extension UIButton {
    ...
    public var baseForegroundColor: UIColor?
    public var baseBackgroundColor: UIColor?
    ...
}
What has gone wrong there?
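One plausible explanation, offered as an assumption: in current UIKit these two properties are declared on UIButton.Configuration rather than on UIButton itself, so the extension seen in the generated interface is likely "extension UIButton.Configuration". Under that assumption, a sketch of the color toggle (iOS 15 or later) would be:

// Start from the button's existing configuration, or a default filled one.
var config = sender.configuration ?? UIButton.Configuration.filled()
config.baseBackgroundColor = toggleLight ? UIColor.black : UIColor.white
config.baseForegroundColor = toggleLight ? UIColor.white : UIColor.black
// Assigning the modified configuration back applies the colors.
sender.configuration = config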
I have a warning message that the method AVCapturePhotoOutput.dngPhotoDataRepresentation is deprecated. What has this method been superseded by?
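If I read the AVFoundation deprecations correctly, the superseding path is the AVCapturePhoto-based API: in the didFinishProcessingPhoto callback, fileDataRepresentation() returns the finished file data (DNG bytes for a RAW capture). A sketch, where outputURL is a hypothetical destination:

func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {
    // For a RAW (DNG) request, fileDataRepresentation() yields the DNG file bytes.
    guard error == nil, let dngData = photo.fileDataRepresentation() else { return }
    try? dngData.write(to: outputURL) // outputURL: hypothetical file URL
}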
I am following these directions to code picture taking:
https://www.appcoda.com/avfoundation-swift-guide/
These directions cover taking JPEG shots, but what is needed is output in a lossless format such as DNG, PNG, or BMP.
How would those instructions be modified to output pictures in a lossless format? A sketch of one possibility follows below.
Is there a tutorial, similar to the one linked above, that explains how to capture in lossless formats?
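In the meantime, a hedged sketch of one possible modification, assuming the tutorial's AVCapturePhotoOutput instance is named photoOutput: leave the capture flow unchanged and re-encode the delivered photo as PNG, which is a lossless container (the pixels have, of course, already passed through the camera's processing pipeline):

func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {
    guard error == nil, let cgImage = photo.cgImageRepresentation() else { return }
    // pngData() performs lossless PNG encoding of the delivered image.
    let pngData = UIImage(cgImage: cgImage).pngData()
    // write pngData to disk, or hand it to the app's save path...
}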
This handler in my photo app:
cameraController.captureImage { (image: UIImage?, error: Error?) in
    // This closure is the completion handler; it is called when picture taking completes.
    if let error = error {
        print("camera completion handler called with error: " + error.localizedDescription)
    } else {
        print("camera completion handler called.")
    }
    // If no image was captured, return here:
    guard let image = image else {
        print(error ?? "Image capture error")
        return
    }
    let phpPhotoLib = PHPhotoLibrary.shared()
    if let reason = phpPhotoLib.unavailabilityReason {
        print("phpPhotoLib.unavailabilityReason = \(reason)")
    } else {
        // There will be a crash right here unless the key "Privacy - Photo Library Usage Description" exists in this app's Info.plist.
        try? phpPhotoLib.performChangesAndWait {
            PHAssetChangeRequest.creationRequestForAsset(from: image)
        }
    }
}
asks the user for permission to store the captured photo data in the library. It does this once; it does not ask on subsequent photo captures. Is there a way for the app to bypass this permission dialog, or at least present it to the user in advance of triggering a photo capture, when the app first runs?
In this situation the camera is being used to capture science data in a laboratory setting. The scientist using this app already expects, and needs, the photo to be stored. The redundant permission dialog makes data collection cumbersome and inconvenient, given the extra effort required to reach the camera where it is located.
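One avenue worth trying, stated as an assumption rather than a confirmed answer: as far as I know the system dialog cannot be bypassed outright, but on iOS 14 and later add-only Photos authorization can be requested explicitly when the app first runs, so the prompt appears before any capture is triggered:

import Photos

// Request add-only library access up front (for example, in viewDidLoad),
// so the system prompt is shown before the first photo capture.
PHPhotoLibrary.requestAuthorization(for: .addOnly) { status in
    if status != .authorized {
        print("Photo library add access not granted: \(status)")
    }
}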
I tried various ways to instantiate the AVCapturePhotoSettings class, as listed below:
let settings = AVCapturePhotoSettings(format: [kCVPixelBufferPixelFormatTypeKey: "BGRA"])
let settings = AVCapturePhotoSettings(format: [kCVPixelBufferPixelFormatTypeKey as String: "BGRA"])
let settings = AVCapturePhotoSettings(format: [String(kCVPixelBufferPixelFormatTypeKey): "BGRA"])
let settings = AVCapturePhotoSettings(format: ["kCVPixelBufferPixelFormatTypeKey": "BGRA"])
The first attempt gets the compile-time error:
Cannot convert value of type 'CFString' to expected dictionary key type 'String'
and so the three other attempts were made, but each of those got the run-time error:
*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** +[AVCapturePhotoSettings photoSettingsWithFormat:] Either kCVPixelBufferPixelFormatTypeKey or AVVideoCodecKey must be specified'
What is the correct way to use this initializer of the AVCapturePhotoSettings class?
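For what it is worth, a sketch of the usage that appears consistent with the documentation: bridge the key to String, and pass the numeric pixel format constant rather than the string "BGRA":

import AVFoundation

// kCVPixelFormatType_32BGRA is the numeric FourCC for 'BGRA' (1111970369).
let settings = AVCapturePhotoSettings(format: [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
])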
The need is to capture an image in an uncompressed format and save it in a lossless file format. I am not particular about which uncompressed format. However, AVCapturePhotoOutput.availablePhotoPixelFormatTypes shows that on my iPhone 14 Pro the BGRA format is available, and its type value is 1111970369. So I use this code to get a settings object and trigger a photo capture with it:
let settings = AVCapturePhotoSettings(format: [ String( kCVPixelBufferPixelFormatTypeKey ) : 1111970369] )
self.photoOutput?.capturePhoto(with: settings, delegate: self )
I am not sure I have the AVCapturePhotoSettings() dictionary argument right; it is the best I could conclude from the available documentation, and it does not produce a compile-time error. I found no example of how to produce uncompressed photo files, only one that produces JPEGs, and I have successfully used that example to capture images in JPEG format. This is the URL for that example, which I am now attempting to modify to produce uncompressed image files: https://www.appcoda.com/avfoundation-swift-guide/ . That example is somewhat out of date, and some method names had to be changed accordingly.
Next, when the system calls the delegated callback:
func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?)
I need to convert the passed photo.pixelBuffer image object into one I can pass to:
self.photoCaptureCompletionBlock?(image, nil)
The UIImage(data:) initializer cannot do this alone; I get a compile-time error when I pass photo.pixelBuffer to it. I tried to convert with the AVCapturePhotoOutput class, but could not find a version of its initializers the compiler would accept for this.
What do I need to do to get the data in photo.pixelBuffer stored in an uncompressed file?
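A sketch of one conversion path that should work, going CVPixelBuffer to CIImage to CGImage to UIImage; pngData() then yields lossless bytes to write to a file:

import AVFoundation
import CoreImage
import UIKit

func uiImage(from photo: AVCapturePhoto) -> UIImage? {
    guard let pixelBuffer = photo.pixelBuffer else { return nil }
    // Wrap the raw pixel buffer, then render it into a CGImage.
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}

// In the delegate callback, for example:
// self.photoCaptureCompletionBlock?(uiImage(from: photo), nil)
// let pngData = uiImage(from: photo)?.pngData() // lossless PNG bytes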
I followed the directions to capture and store uncompressed (RAW) image files at:
https://developer.apple.com/documentation/avfoundation/photo_capture/capturing_photos_in_raw_and_apple_proraw_formats
I excluded the thumbnails because they are not needed. When picture taking is attempted, the completion-handling callback method:
RAWCaptureDelegate.photoOutput( _ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?)
is called as expected, but with its "error" parameter set to the value:
Error Domain=AVFoundationErrorDomain Code=-11803 "Cannot Record" UserInfo={AVErrorRecordingFailureDomainKey=3, NSLocalizedDescription=Cannot Record, NSLocalizedRecoverySuggestion=Try recording again.}
I do not know what to do about this error. The error message recommends trying again, but the error repeats every time.
Any suggestions on possible causes, or how to proceed to find out what is going wrong, would be much appreciated.
The device this is run on is an iPhone 14 Pro.
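Not an answer, but a diagnostic sketch, under the assumption that -11803 can result from requesting a RAW format the current session configuration does not support; checking availability just before the capture may narrow the problem down (photoOutput and rawDelegate are the names assumed here for the configured AVCapturePhotoOutput and the article's RAWCaptureDelegate instance):

guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first else {
    print("No RAW pixel formats are available with the current session configuration.")
    return
}
let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
photoOutput.capturePhoto(with: settings, delegate: rawDelegate)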
I am developing an app that must run on both a Mac Studio and an iPhone 14. The app instance on the Mac communicates with the app's other instance on the iPhone 14.
To debug the Bonjour communications I need to run this source code under the debugger on the Mac Studio (in the simulator) and on the iPhone simultaneously. Ideally I could step through source code and set breakpoints in each app instance at the same time. Is that possible? If so, how?