Is there a setting in the iPhone 14's iOS that allows the flashlight (torch) to remain on while the camera takes a still shot?
If not, is there an app that can do that?
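The torch can at least be switched on programmatically through AVFoundation while a capture session runs. A minimal sketch (whether the torch stays lit through the actual still exposure depends on the device, since the torch and flash share hardware):

```swift
import AVFoundation

// Turn the torch on or off on the default video capture device.
func setTorch(on: Bool) {
    guard let device = AVCaptureDevice.default(for: .video),
          device.hasTorch else { return }
    do {
        try device.lockForConfiguration()
        device.torchMode = on ? .on : .off
        device.unlockForConfiguration()
    } catch {
        print("Torch configuration failed: \(error)")
    }
}
```

Setting the photo capture's `flashMode` to `.off` at the same time may keep the flash hardware from overriding the torch during the shot.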
To learn how to do peer-to-peer communication I downloaded the TicTacToe example onto my Mac Studio from:
https://developer.apple.com/documentation/network/building_a_custom_peer-to-peer_protocol
I then loaded it into Xcode, compiled it, deployed it to an iPhone 14 over a Lightning cable, and ran it. I developed code on the Mac Studio in an attempt to communicate with it. The Mac Studio at first detected the service, but for some reason it has quit doing so.
The command dns-sd -B _services._dns-sd._udp outputs this as one of its lines:
A/R Flags if Domain Service Type Instance Name
Add 3 17 . tcp.local. _tictactoe
(The timestamp column is deleted for clarity.)
This line remains in the command's output even after the TicTacToe app on the iPhone has been shut down, and after Xcode on the Mac Studio, along with the simulator it opens, has been shut down. In an attempt to find out what application is still advertising this Bonjour service I installed Discovery from:
https://apps.apple.com/us/app/discovery-dns-sd-browser/id305441017
When run, the _tictactoe service instance does show in its list as well. But when I click this item in that list to get a detail view of it, I get a perpetual spinning wheel with the message "Scanning..." just to its right. No information is displayed. What does that mean in regard to what program is still advertising this service? What other ways are available to find this program so it can be shut down? Is it possible that this presence in the list, and in the output of dns-sd, is just a phantom vestige of an application that is no longer running? If that is the case, how can the service offering be shut down?
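As a cross-check against dns-sd and Discovery, the Network framework can browse for the instance directly. A sketch, assuming the service type is "_tictactoe._tcp" (the type the sample registers may differ):

```swift
import Network

// Browse for the Bonjour service and print every result as the set changes.
let browser = NWBrowser(
    for: .bonjour(type: "_tictactoe._tcp", domain: nil),
    using: NWParameters()
)
browser.browseResultsChangedHandler = { results, _ in
    for result in results {
        print("Found: \(result.endpoint)")
    }
}
browser.start(queue: .main)
```

If the instance resolves to no host or port, that suggests a stale mDNS record rather than a live advertiser; restarting mDNSResponder on the Mac (or rebooting the devices) typically clears such records.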
I am still very new to Swift. Now I am learning about storyboards.
I added two View Controllers to the app by using the + icon in Xcode's toolbar, added one button to each of their screens, and then added connectors (segues) between them, so that each button would navigate to the other screen. As soon as these connectors are added, each segue gets the error:
/Users/ ... /LaunchScreen.storyboard Launch screens may not have triggered segues.
So I figured at first that the first screen I added to the project, by means of adding a View Controller, must have been the launch screen. So I removed the segues, added two more screens in the same way, and made the same connections between these two new screens. The same errors appeared. It appears to me that Xcode considers every new screen added by dragging in a View Controller to be a launch screen.
How do I make only the first screen the "Launch Screen"?
The iPhone has a name that can be looked up, and altered, by navigating to:
Settings > General > About > Name
How can a Swift application access this name? It is only the device name I seek, not the username.
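UIKit exposes the user-visible device name directly. One caveat: as of iOS 16, this property returns a generic value such as "iPhone" unless the app holds the `com.apple.developer.device-information.user-assigned-device-name` entitlement.

```swift
import UIKit

// The name set under Settings > General > About > Name
// (generic on iOS 16+ without the user-assigned-device-name entitlement):
let deviceName = UIDevice.current.name
print(deviceName)
```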
I used Interface Builder to place a Switch object in a view. I ctrl-dragged it into the Assistant to make its handler a method of the class the view is in, which is itself a subclass of "UITableViewController". A dialog box appeared, into which I entered the function name, and I selected the option to have the sender as an argument.
The result is a function of the form:
@IBAction func switchStateChange(_ sender: UISwitch) {
}
Now I need to query this Switch's On/Off state from elsewhere in the program, and if necessary change its state. It is not a good solution to save the sender parameter to a global variable because then it would require the user to change this switch's state before that global variable is set.
What is needed is to identify, and lookup, this switch object to access it from anywhere in the application. How is this done?
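The usual pattern is to ctrl-drag the switch a second time as an Outlet rather than an Action, which gives the view controller a property referencing the switch that any code holding the controller can use. A sketch; the class name, outlet name, and method names here are hypothetical:

```swift
import UIKit

class SettingsViewController: UITableViewController {

    // Outlet connected in Interface Builder (ctrl-drag, connection type "Outlet"):
    @IBOutlet weak var mySwitch: UISwitch!

    // Action connected to the switch's Value Changed event:
    @IBAction func switchStateChange(_ sender: UISwitch) {
        print("Switch is now \(sender.isOn ? "on" : "off")")
    }

    // Callable from anywhere that has a reference to this controller:
    func readAndToggleSwitch() {
        let isOn = mySwitch.isOn                // query the state
        mySwitch.setOn(!isOn, animated: true)   // change the state
    }
}
```

The outlet is set when the view loads, so it is valid before the user ever touches the switch, avoiding the problem with saving the sender parameter.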
I do not know what I did to make it so, but somehow I accidentally got Xcode's Interface Builder to display a storyboard file in an undesired mode that shows the UI buttons and text fields as rectangles, as in the screenshot below. How do I reverse what I did to get it out of this mode?
I am not able to drag a reference object from a storyboard screen to code in the Assistant. The expected blue line appears, as the screenshot shows, but no code is created in the Assistant. This reference object is not in the Launch Screen. When the screen this object is in is highlighted, the correct file opens in the Assistant. Dragging this object from the "Content View" folder also does not work, even though the blue line appears there too.
I was recently able to do this. Something changed. What might have changed?
I am attempting to follow this example of how to create a camera app:
https://www.appcoda.com/avfoundation-swift-guide/
It describes a sequence of 4 steps, and these steps are embodied in four functions:
func prepare(completionHandler: @escaping (Error?) -> Void) {
    func createCaptureSession() { }
    func configureCaptureDevices() throws { }
    func configureDeviceInputs() throws { }
    func configurePhotoOutput() throws { }

    DispatchQueue(label: "prepare").async {
        do {
            createCaptureSession()
            try configureCaptureDevices()
            try configureDeviceInputs()
            try configurePhotoOutput()
        }
        catch {
            DispatchQueue.main.async {
                completionHandler(error)
            }
            return
        }
        DispatchQueue.main.async {
            completionHandler(nil)
        }
    }
}
What is confusing me is that these four steps appear to need sequential execution, yet in the DispatchQueue they seem to be executed simultaneously because .async is used. For example, farther down that webpage the functions createCaptureSession(), configureCaptureDevices(), and the others are defined. In createCaptureSession() the member variables self.frontCamera and self.rearCamera are given values which are used in configureCaptureDevices(). So configureCaptureDevices() depends on createCaptureSession() having already been executed, something that apparently cannot be depended upon if both functions are executing simultaneously in separate threads.
What then is the benefit of using DispatchQueue()? How is it assured the above example dependencies are met?
What is the label parameter of the DispatchQueue()'s initializer used for?
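A point worth noting: a DispatchQueue created without the `.concurrent` attribute is serial, so blocks submitted with `.async` still run one at a time, in submission order; `.async` only means the *caller* does not wait. Also, in the code above there is a single `.async` call whose one block runs the four functions sequentially inside it. The label is just a debugging identifier that shows up in Instruments and crash logs. A small demonstration:

```swift
import Foundation

// A queue created without .concurrent is serial: blocks run in FIFO order.
let queue = DispatchQueue(label: "prepare")
var order: [Int] = []
let group = DispatchGroup()
for i in 1...4 {
    queue.async(group: group) {
        order.append(i)
    }
}
group.wait()  // block the caller until all four tasks finish
print(order)  // [1, 2, 3, 4] — always in submission order on a serial queue
```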
I have a UITextField object on a storyboard's scene. Its "Editing Did End" event is connected to the view controller's "actIPhoneName()" event handler method as shown in this screenshot.
I configured the keyboard to show its "Done" key. I expected that when this Done key is touched, the keyboard would disappear and the actIPhoneName() method would be called. But neither of these happens, nor does anything else. The UITextField object remains in edit mode, and the breakpoint is never reached.
What must I do to make the Done keyboard key work to make the UITextField object lose its First Responder status, and call its "Editing Did End" event handler method?
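A common cause is that the text field has no delegate, so the Return/Done key is ignored. A sketch of the usual fix; the class and outlet names here are hypothetical:

```swift
import UIKit

class NameViewController: UIViewController, UITextFieldDelegate {
    @IBOutlet weak var nameField: UITextField!

    override func viewDidLoad() {
        super.viewDidLoad()
        nameField.delegate = self   // without this, the Done key does nothing
    }

    // Called when the Done key is tapped:
    func textFieldShouldReturn(_ textField: UITextField) -> Bool {
        // Dismissing first responder hides the keyboard and fires
        // the "Editing Did End" event on its connected action.
        textField.resignFirstResponder()
        return true
    }
}
```

The delegate can also be set by ctrl-dragging from the text field to the view controller in Interface Builder and choosing "delegate".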
I am following these directions to code picture taking:
https://www.appcoda.com/avfoundation-swift-guide/
These directions are for taking JPEG shots, but what is needed is output in a lossless format such as DNG, PNG, or BMP.
How would those instructions be modified to output pictures in a lossless format?
Is there a tutorial similar to the one linked to above that explains how to do lossless formats?
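One approach, sketched here under the assumption that the session uses an AVCapturePhotoOutput and a standard AVCapturePhotoCaptureDelegate: request a RAW capture, in which case the delegate receives data that is already DNG and can be written out directly instead of re-encoding to JPEG.

```swift
import AVFoundation

// In the AVCapturePhotoCaptureDelegate: a RAW capture's
// fileDataRepresentation() is DNG data; write it to disk as-is.
func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {
    guard error == nil, photo.isRawPhoto,
          let dngData = photo.fileDataRepresentation() else { return }
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent("capture.dng")
    try? dngData.write(to: url)
}
```

The matching settings object would come from `AVCapturePhotoSettings(rawPixelFormatType:)`, using a type from the output's `availableRawPhotoPixelFormatTypes`. For PNG, the usual route is converting the captured image with `CGImageDestination` or `UIImage.pngData()` after capture, since the photo output does not emit PNG directly.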
There is a crash on the closing brace of the photo capture completion handler where commented below.
// Initiate image capture:
cameraController.captureImage { (image: UIImage?, error: Error?) in
    // This closure is the completion handler; it is called when
    // the picture taking is completed.
    print("camera completion handler called.")
    guard let image = image else {
        print(error ?? "Image capture error")
        return
    }
    try? PHPhotoLibrary.shared().performChangesAndWait {
        PHAssetChangeRequest.creationRequestForAsset(from: image)
    } // crash here
}
When stepping through pauses on that closing brace and the next step is taken, there is a crash. Once crashed, what I see next is assembly code, as shown in the disassembly below:
libsystem_kernel.dylib`:
0x1e7d4a39c <+0>: mov x16, #0x209
0x1e7d4a3a0 <+4>: svc #0x80
-> 0x1e7d4a3a4 <+8>: b.lo 0x1e7d4a3c4 ; <+40> Thread 11: signal SIGABRT
0x1e7d4a3a8 <+12>: pacibsp
0x1e7d4a3ac <+16>: stp x29, x30, [sp, #-0x10]!
0x1e7d4a3b0 <+20>: mov x29, sp
0x1e7d4a3b4 <+24>: bl 0x1e7d3d984 ; cerror_nocancel
0x1e7d4a3b8 <+28>: mov sp, x29
0x1e7d4a3bc <+32>: ldp x29, x30, [sp], #0x10
0x1e7d4a3c0 <+36>: retab
0x1e7d4a3c4 <+40>: ret
Obviously I have screwed up picture taking somewhere. I would much appreciate suggestions on what diagnostics will lead to the resolution of this problem. I can make the entire picture-taking code available on request as an attachment; it is too lengthy to post here.
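Two diagnostics that may help: an Exception Breakpoint in Xcode (Breakpoint navigator > + > Exception Breakpoint), which stops at the Swift/Objective-C frame that raises the abort instead of in assembly, and a check of photo-library authorization before saving. A SIGABRT at exactly this point is commonly a missing "Privacy - Photo Library Usage Description" (or the add-only variant) entry in the app's Info.plist. A minimal status check:

```swift
import Photos

// Print the add-only photo-library authorization before attempting a save.
// .notDetermined or .denied here points at a permissions/Info.plist problem.
let status = PHPhotoLibrary.authorizationStatus(for: .addOnly)
print("Photo library add-only authorization: \(status.rawValue)")
```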
This handler in my photo app:
cameraController.captureImage { (image: UIImage?, error: Error?) in
    // This closure is the completion handler; it is called when the picture taking is completed.
    if let error = error {
        print("camera completion handler called with error: " + error.localizedDescription)
    } else {
        print("camera completion handler called.")
    }
    // If no image was captured, return here:
    guard let image = image else {
        print(error ?? "Image capture error")
        return
    }
    let phpPhotoLib = PHPhotoLibrary.shared()
    if let reason = phpPhotoLib.unavailabilityReason {
        print("phpPhotoLib.unavailabilityReason = \(reason)")
    } else {
        // There will be a crash right here unless the key "Privacy - Photo Library Usage Description" exists in this app's .plist file
        try? phpPhotoLib.performChangesAndWait {
            PHAssetChangeRequest.creationRequestForAsset(from: image)
        }
    }
}
asks the user for permission to store the captured photo data in the library. It does this once; it does not ask on subsequent photo captures. Is there a way for the app to bypass this user permission request dialog, or at least present it to the user when the app first runs, in advance of triggering a photo capture?
In this situation the camera is being used to capture science data in a laboratory setting. The scientist using this app already expects, and needs, the photo to be stored. The redundant permission request dialog makes data collection cumbersome and inconvenient, due to the extra effort required to reach the camera where it is located.
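The prompt cannot be bypassed, but it can be triggered ahead of time. Requesting authorization explicitly at launch (for example in the first view controller's viewDidLoad) makes the system dialog appear then, and it never reappears once the user has answered:

```swift
import Photos

// Ask for add-only photo-library access up front, before any capture.
// The system shows its dialog only the first time; later calls just
// report the stored answer.
PHPhotoLibrary.requestAuthorization(for: .addOnly) { status in
    print("Photo library authorization status: \(status.rawValue)")
}
```

`.addOnly` is sufficient for saving captures; `.readWrite` would be needed only if the app also reads the library.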
I listed the AVCapturePhotoSettings.availablePhotoPixelFormatTypes array in my iPhone 14 during a running photo session and I got these type numbers:
875704422
875704438
1111970369
I have no idea what these numbers mean. How can I use these numbers to look up a human-readable string that tells me what these types are in terms I am familiar with, such as JPEG, TIFF, PNG, BMP, DNG, etc., so I know which of these numbers to choose when I instantiate the AVCaptureSession class?
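These numbers are FourCC codes: each 32-bit value is four ASCII characters packed together, naming a CoreVideo *pixel* format (how sensor/buffer pixels are laid out), not a file format like JPEG or PNG. A small decoder:

```swift
import Foundation

// Decode an OSType/FourCharCode value into its four-character string.
func fourCC(_ value: UInt32) -> String {
    let bytes: [UInt8] = [
        UInt8((value >> 24) & 0xFF),
        UInt8((value >> 16) & 0xFF),
        UInt8((value >> 8) & 0xFF),
        UInt8(value & 0xFF),
    ]
    return String(bytes: bytes, encoding: .ascii) ?? "????"
}

print(fourCC(875704422))   // "420f"
print(fourCC(875704438))   // "420v"
print(fourCC(1111970369))  // "BGRA"
```

Those three decode to kCVPixelFormatType_420YpCbCr8BiPlanarFullRange ('420f'), kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange ('420v'), and kCVPixelFormatType_32BGRA ('BGRA'). File formats such as JPEG or HEVC are chosen separately, via AVCapturePhotoSettings' codec/processed-format options rather than these pixel-format types.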
I followed the directions to capture and store uncompressed (RAW) image files at:
https://developer.apple.com/documentation/avfoundation/photo_capture/capturing_photos_in_raw_and_apple_proraw_formats
I excluded the thumbnails because they are not needed. When picture taking is attempted, the completion-handler callback method:
RAWCaptureDelegate.photoOutput( _ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?)
is called as expected, but called with its "error" parameter set to the value:
Error Domain=AVFoundationErrorDomain Code=-11803 "Cannot Record" UserInfo={AVErrorRecordingFailureDomainKey=3, NSLocalizedDescription=Cannot Record, NSLocalizedRecoverySuggestion=Try recording again.}
I do not know what to do about this error. The error message recommends trying again, but the error repeats every time.
Any suggestions on possible causes, or how to proceed to find out what is going wrong, would be much appreciated.
The device this is run on is an iPhone 14 Pro.
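One common cause of -11803 "Cannot Record" is requesting a RAW pixel format that the output does not offer under the current session configuration (for example, with an unsupported session preset or camera). A defensive sketch, assuming a configured AVCapturePhotoOutput named photoOutput (a hypothetical name):

```swift
import AVFoundation

// Build RAW capture settings only from a format the output actually offers
// right now; return nil (rather than crashing later) when none is available.
func makeRAWSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings? {
    guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first(where: {
        AVCapturePhotoOutput.isBayerRAWPixelFormat($0)
    }) else {
        print("No Bayer RAW format available in the current session configuration.")
        return nil
    }
    return AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
}
```

Logging `availableRawPhotoPixelFormatTypes` right before each capture attempt would show whether the session lost RAW support between configuration and capture.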
I am developing an app that is to run on both a Mac Studio and an iPhone 14. The app instance on the Mac communicates with the app's other instance on the iPhone 14.
To debug the Bonjour communications I need to run this source code under the debugger on both the Mac Studio and the iPhone simultaneously. The ideal would be to be able to step through source code, and set breakpoints, in each app instance at the same time. Is that possible? If so, how?