Here is a code snippet by which I attempted to set a neutral white balance:
func prepare(completionHandler: @escaping (Error?) -> Void) {
    func createCaptureSession() {
        self.captureSession = AVCaptureSession()
    }
    func configureCaptureDevices() throws {
        let session = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera], mediaType: AVMediaType.video, position: .back)
        let cameras = (session.devices.compactMap { $0 })
        if cameras.isEmpty {
            throw CameraControllerError.noCamerasAvailable
        }
        for camera in cameras {
            try camera.lockForConfiguration()
            camera.setWhiteBalanceModeLocked(
                with whiteBalanceGains: AVCaptureDevice.WhiteBalanceGains( redGain: 1.0, greenGain: 1.0, blueGain: 1.0),
                completionHandler handler: ((CMTime) -> Void)? = nil
            ) // Errors: Extra arguments at positions #2, #3 in call, Cannot find 'with' in scope
            camera.whiteBalanceMode = .locked
            camera.unlockForConfiguration()
        } catch {
            // To Do: Error handling here
        }
    }
My call to "AVCaptureDevice.WhiteBalanceGains()" gets errors. What is the correct way to do this?
The goal is to have the background and foreground colors show the current state of a UIButton in a storyboard file. The attempt to do that is done with this code:
var toggleLight: Bool = false

@IBAction func toggleLight(_ sender: UIButton) {
    if( toggleLight ){
        toggleLight = false
        sender.baseBackgroundColor = UIColor.black
        sender.baseForegroundColor = UIColor.white
    }else{
        toggleLight = true
        sender.baseBackgroundColor = UIColor.white
        sender.baseForegroundColor = UIColor.black
    }
    tableView.reloadData()
}
I get these errors:
Value of type 'UIButton' has no member 'baseBackgroundColor'
Value of type 'UIButton' has no member 'baseForegroundColor'
I do not understand this because in file "UIKit.UIButton" I see:
extension UIButton {
    ...
    public var baseForegroundColor: UIColor?
    public var baseBackgroundColor: UIColor?
    ...
}
What has gone wrong there?
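For what it is worth, here is a variant I am considering, on the assumption that baseBackgroundColor and baseForegroundColor actually belong to UIButton.Configuration (iOS 15 and later) rather than to UIButton itself. This is an unverified sketch, and the state variable is renamed to isLightOn to avoid clashing with the method name:

var isLightOn: Bool = false

@IBAction func toggleLight(_ sender: UIButton) {
    isLightOn.toggle()
    // Assumption: the colors live on UIButton.Configuration, so read the button's
    // configuration (or start from .filled()), modify it, and assign it back.
    var config = sender.configuration ?? .filled()
    config.baseBackgroundColor = isLightOn ? UIColor.white : UIColor.black
    config.baseForegroundColor = isLightOn ? UIColor.black : UIColor.white
    sender.configuration = config
    tableView.reloadData()
}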
I am getting a warning that the method AVCapturePhotoOutput.dngPhotoDataRepresentation is deprecated. What has this method been superseded by?
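If it helps frame the question, this is the shape I suspect the replacement takes: asking the AVCapturePhoto object delivered to the capture delegate for its data. An unverified sketch (inside the existing AVCapturePhotoCaptureDelegate conformance):

func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {
    guard error == nil else { return }
    // Assumption: for a RAW/DNG capture this returns DNG-formatted data that could
    // be written to a .dng file or handed to PHAssetCreationRequest.
    if let dngData = photo.fileDataRepresentation() {
        print("Got \(dngData.count) bytes of photo data")
    }
}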
I am following these directions to implement picture taking:
https://www.appcoda.com/avfoundation-swift-guide/
The tutorial covers taking JPEG shots, but what is needed is to output to a lossless format such as DNG, PNG, or BMP.
How would those instructions be modified to output pictures in a lossless format?
Is there a tutorial similar to the one linked to above that explains how to do lossless formats?
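To show the direction I am exploring, here is a sketch of how I imagine the settings step might change for DNG output. It is built on the assumption that AVCapturePhotoSettings(rawPixelFormatType:), using a format taken from availableRawPhotoPixelFormatTypes, is the right starting point; I have not verified this end to end:

import AVFoundation

// Build capture settings for a RAW (DNG) shot instead of JPEG.
// Assumption: photoOutput is the app's already-configured AVCapturePhotoOutput.
func makeRawSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings? {
    // The output advertises which RAW pixel formats the current device offers.
    guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first else {
        return nil // no RAW format available on this device/configuration
    }
    return AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
}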
This handler in my photo app:
cameraController.captureImage() { (image: UIImage?, error: Error?) in
    // This closure is the completion handler; it is called when the picture taking is completed.
    if let error = error {
        print("camera completion handler called with error: " + error.localizedDescription)
    } else {
        print("camera completion handler called.")
    }
    // If no image was captured, return here:
    guard let image = image else {
        print(error ?? "Image capture error")
        return
    }
    let phpPhotoLib = PHPhotoLibrary.shared()
    if( phpPhotoLib.unavailabilityReason.debugDescription != "nil" ){
        print("phpPhotoLib.unavailabilityReason.debugDescription = " + phpPhotoLib.unavailabilityReason.debugDescription)
    } else {
        // There will be a crash right here unless the key "Privacy - Photo Library Usage Description" exists in this app's .plist file
        try? phpPhotoLib.performChangesAndWait {
            PHAssetChangeRequest.creationRequestForAsset(from: image)
        }
    }
}
asks the user for permission to store the captured photo data in the library. It does this once; it does not ask on subsequent photo captures. Is there a way for the app to bypass this user permission request dialog, or at least present it to the user when the app first runs, in advance of triggering a photo capture?
In this situation the camera is being used to capture science data in a laboratory setting. The scientist using this app already expects, and needs, the photo to be stored. The redundant permission request dialog makes data collection cumbersome and inconvenient because of the extra effort required to reach the camera where it is located.
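One idea I am considering, as an unverified sketch: the system dialog itself presumably cannot be bypassed, but the prompt could perhaps be moved to app startup by requesting add-only photo library authorization before any capture is triggered, for example:

import Photos

// Ask for add-only photo library access up front, e.g. from
// application(_:didFinishLaunchingWithOptions:), so the scientist sees the
// dialog before the first capture, not during it.
// Assumption: Info.plist contains the "Privacy - Photo Library Additions Usage Description" key.
PHPhotoLibrary.requestAuthorization(for: .addOnly) { status in
    if status != .authorized {
        print("Photo library add access not granted: \(status)")
    }
}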
I tried various ways to instantiate the AVCapturePhotoSettings class, as listed below:
let settings = AVCapturePhotoSettings(format: [ kCVPixelBufferPixelFormatTypeKey : "BGRA"] )
let settings = AVCapturePhotoSettings(format: [ kCVPixelBufferPixelFormatTypeKey as String : "BGRA"] )
let settings = AVCapturePhotoSettings(format: [ String( kCVPixelBufferPixelFormatTypeKey) : "BGRA"] )
let settings = AVCapturePhotoSettings(format: [ "kCVPixelBufferPixelFormatTypeKey" : "BGRA"] )
The first method gets the error:
Cannot convert value of type 'CFString' to expected dictionary key type 'String'
and so the other three attempts listed after it were made, but each of these attempts got the runtime error:
*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** +[AVCapturePhotoSettings photoSettingsWithFormat:] Either kCVPixelBufferPixelFormatTypeKey or AVVideoCodecKey must be specified'
What is the correct way to use this initializer of the AVCapturePhotoSettings class?
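For reference, this is the shape I currently suspect the format dictionary needs: the pixel-format key mapped to a numeric pixel format constant rather than the string "BGRA". A sketch based on my reading of the headers, not confirmed:

import AVFoundation

// Assumption: the value must be a numeric CVPixelFormatType such as
// kCVPixelFormatType_32BGRA, not the string "BGRA"; ideally one taken from
// photoOutput.availablePhotoPixelFormatTypes.
let settings = AVCapturePhotoSettings(format: [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
])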
The need is to capture an image in an uncompressed format, and save it in a lossless file format. I am not particular about which uncompressed format. However AVCapturePhotoOutput.availablePhotoPixelFormatTypes shows that on my iPhone 14 Pro the BGRA format is available, and its type value is: 1111970369. So I use this code to get a settings object, and trigger a photo capture with it:
let settings = AVCapturePhotoSettings(format: [ String( kCVPixelBufferPixelFormatTypeKey ) : 1111970369] )
self.photoOutput?.capturePhoto(with: settings, delegate: self )
I am not sure I have the AVCapturePhotoSettings() dictionary argument right. It is the best I could conclude from the available documentation, and it does not produce a compile-time error. I found no example of how to produce uncompressed photo files; the only one I found produces JPEGs, and I have successfully used that example to capture images in JPEG format. This is the URL for that example, which I am now attempting to modify to produce uncompressed image files: https://www.appcoda.com/avfoundation-swift-guide/ . That example is somewhat out of date, and some method names had to be changed accordingly.
Next, when the system calls the delegated callback:
func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?)
I need to convert the passed photo.pixelBuffer image object into one I can pass to:
self.photoCaptureCompletionBlock?(image, nil)
The UIImage(data:) initializer cannot do this alone; I get a compile-time error when I pass photo.pixelBuffer to it. I tried to convert with the AVCapturePhotoOutput() class, but could not find a version of its initializers that the compiler would accept for this.
What do I need to do to get the data in photo.pixelBuffer stored in an uncompressed file?
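For context, the conversion path I am currently considering looks like this: wrap the CVPixelBuffer in a CIImage, render a CGImage with a CIContext, and wrap that in a UIImage, from which UIImage.pngData() could produce a lossless file. This is an unverified sketch and the helper name is mine:

import AVFoundation
import CoreImage
import UIKit

// Hypothetical helper: convert the capture's pixel buffer into a UIImage.
func uiImage(from photo: AVCapturePhoto) -> UIImage? {
    guard let pixelBuffer = photo.pixelBuffer else { return nil }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}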
I followed the directions for capturing and storing uncompressed (RAW) image files at:
https://developer.apple.com/documentation/avfoundation/photo_capture/capturing_photos_in_raw_and_apple_proraw_formats
I excluded the thumbnails because they are not needed. When picture taking is attempted, the completion-handling callback method:
RAWCaptureDelegate.photoOutput( _ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?)
is called as expected, but called with its "error" parameter set to the value:
Error Domain=AVFoundationErrorDomain Code=-11803 "Cannot Record" UserInfo={AVErrorRecordingFailureDomainKey=3, NSLocalizedDescription=Cannot Record, NSLocalizedRecoverySuggestion=Try recording again.}
I do not know what to do about this error. The error message recommends trying again, but the error repeats every time.
Any suggestions on possible causes, or how to proceed to find out what is going wrong, would be much appreciated.
The device this is run on is an iPhone 14 Pro.
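In case it is useful for diagnosing this, here is a small check I plan to add before building the RAW settings. It is only a sketch, under the assumption that a mismatch between the requested RAW format and what the output currently offers could produce this kind of error:

import AVFoundation

// Log what the photo output actually offers before requesting a RAW capture.
// Assumption: photoOutput is the app's configured AVCapturePhotoOutput.
func logRawCapabilities(of photoOutput: AVCapturePhotoOutput) {
    print("Apple ProRAW supported: \(photoOutput.isAppleProRAWSupported)")
    print("Apple ProRAW enabled:   \(photoOutput.isAppleProRAWEnabled)")
    print("RAW formats offered:    \(photoOutput.availableRawPhotoPixelFormatTypes)")
}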
I am developing an app that must run on both a Mac Studio and an iPhone 14. The app instance on the Mac communicates with the app's other instance on the iPhone 14.
To debug the Bonjour communication I need to run this source code under the debugger in the simulator on the Mac Studio and on the iPhone, simultaneously. The ideal would be to be able to step through source code, and set breakpoints, in each app instance at the same time. Is that possible? If so, how?
The Info.plist file is missing from my project as shown in this screenshot:
Yet the project has properties, as the screenshot shows, so the properties are evidently being remembered somewhere, and I do not know where. As can be seen, no path to an Info.plist is currently set.
I need to create a new Info.plist file and add it to the project. My concern now is about synchronization. If I were to do this by navigating to: File => New => File => Resource => Property List, and named it Info.plist, would this new property list file automatically be synchronized with all of my current settings? Or would it contain only default settings and replace my project's settings with those defaults?
I suspect the original Info.plist file was lost when I decided to rename folders and files and had to copy everything into a new project to do so. I failed to copy over the Info.plist file.
In the TicTacToe example, the overridden method "PasscodeViewController.viewDidLoad()" contains this snippet of code:
if let browseResult = browseResult,
   case let NWEndpoint.service(name: name, type: _, domain: _, interface: _) = browseResult.endpoint {
    title = "Join \(name)"
}
What confuses me:
The use of the "case" keyword without a switch statement.
The "case" keyword does not have a constant to compare with to decide if will branch here.
And what of the method call to NWEndpoint.service() being set equal to something? Is this actually defining what the service method will do when the system calls it?
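To make sure I am asking the right question, here is how I currently read it (please correct me if I am wrong). My assumption is that this is Swift's "if case" pattern matching, equivalent to a one-case switch over the endpoint enum, rather than a call to a service() method:

// My assumption: the "if case let" form above is shorthand for a switch that
// matches one enum case and binds its associated values.
if let browseResult = browseResult {
    switch browseResult.endpoint {
    case let NWEndpoint.service(name: name, type: _, domain: _, interface: _):
        // "name" is the associated value bound by the pattern, not a method result.
        title = "Join \(name)"
    default:
        break
    }
}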
I am still very new to Swift. Now I am learning about storyboards.
I added two View Controllers to the app by using the + icon in Xcode's title bar, added one button to each of their screens, and then added connectors (segues) between them so that each button would navigate to the other screen. As soon as these connectors are added, each segue gets the error:
/Users/ ... /LaunchScreen.storyboard Launch screens may not have triggered segues.
At first I figured that the first screen I added to the project, by means of adding a View Controller, must have been the launch screen. So I removed the segues, added two more screens in the same way, and made the same connections between these two new screens. The same errors appeared. It seems that Xcode treats every new screen added by dragging in a View Controller as a launch screen.
How do I make only the first screen the "Launch Screen"?
The iPhone has a name that can be looked up, and altered, by navigating to:
Settings > General > About > Name
How can a Swift application access this name? It is only the device name I seek, not the username.
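For completeness, this is what I have found so far, which may or may not be the right answer: UIDevice.current.name appears to return the user-assigned device name, though my understanding (an assumption) is that recent iOS versions may return a generic value such as "iPhone" unless the app holds the user-assigned-device-name entitlement.

import UIKit

// Read the device name set under Settings > General > About > Name.
// Assumption: on newer iOS versions this may be a generic name unless the
// user-assigned-device-name entitlement is granted.
let deviceName = UIDevice.current.name
print("Device name: \(deviceName)")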
I used Interface Builder to place a Switch object in a view. I Ctrl-dragged it into the Assistant editor to make its handler a method of the class the view belongs to, which is itself a subclass of "UITableViewController". A dialog box appeared, into which I entered the function name, and I selected the option to include the sender as an argument.
The result is a function of the form:
@IBAction func switchStateChange(_ sender: UISwitch) {
}
Now I need to query this Switch's On/Off state from elsewhere in the program, and if necessary change its state. It is not a good solution to save the sender parameter to a global variable because then it would require the user to change this switch's state before that global variable is set.
What is needed is to identify, and lookup, this switch object to access it from anywhere in the application. How is this done?
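The approach I am leaning toward, in case it clarifies the question, is to also connect the switch to an outlet and reach it through the view controller. The names below are placeholders and this is only a sketch:

import UIKit

class SettingsTableViewController: UITableViewController {

    // Hypothetical outlet, connected to the same UISwitch in Interface Builder.
    @IBOutlet weak var lightSwitch: UISwitch!

    @IBAction func switchStateChange(_ sender: UISwitch) {
        // React to user-driven changes here.
    }

    // Read the current state from anywhere that can reach this controller.
    var isLightOn: Bool {
        return lightSwitch.isOn
    }

    // Change the state programmatically.
    func setLight(on: Bool) {
        lightSwitch.setOn(on, animated: true)
    }
}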
The need is to persist, between launches, the state of storyboard objects such as UISwitch and UITextField. Can this be done using @AppStorage? If so, how can @AppStorage be set to watch these?
I tried getting @AppStorage to watch an outlet class member variable that is connected to the storyboard object:
@IBOutlet weak var iPhoneName: UITextField!
@AppStorage("iPhoneName") var iPhoneName: String = ""
This got an error because the variable to be watched is already declared.
I decided to make the watched variable different from the one connected to the Storyboard's UITextField object:
@AppStorage("iPhoneName") var striPhoneName: String = ""
and got the error: Unknown attribute 'AppStorage'. Which import provides @AppStorage?
If @AppStorage cannot be used for this, what is the easiest way to code storyboard object persistence? I am looking for an easy, and quick way. I am not concerned with memory usage right now.
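In case the answer is that @AppStorage does not apply here (it appears to be a SwiftUI property wrapper, from import SwiftUI), this is the plain UserDefaults fallback I am considering. Only a sketch, with the key name reused from above and the action hooked to the text field's "Editing Changed" event in Interface Builder:

import UIKit

class SettingsViewController: UITableViewController {

    @IBOutlet weak var iPhoneName: UITextField!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Restore the persisted value when the view loads.
        iPhoneName.text = UserDefaults.standard.string(forKey: "iPhoneName") ?? ""
    }

    // Persist the value on every edit.
    @IBAction func iPhoneNameChanged(_ sender: UITextField) {
        UserDefaults.standard.set(sender.text ?? "", forKey: "iPhoneName")
    }
}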