I have a powered-up, logged-in iPhone 14 connected to a Mac Studio by a Lightning cable.
This code snippet is being run on the Mac Studio inside the Xcode IDE:
@Published var iPhones: [IPhone] = []
var test: [EAAccessory] = []
var count = 0

/// Update the iPhones list.
func Update() {
    test = EAAccessoryManager.shared().connectedAccessories
    count = test.count
    iPhones = EAAccessoryManager.shared().connectedAccessories.map { IPhone(Accessory: $0) } // ToDo: Filter for iPhone accessories only
}
After the Update() method runs, count is 0 and the test array is empty; the iPhone 14 is not showing up in it.
On this webpage: https://developer.apple.com/documentation/externalaccessory/eaaccessorymanager
I see this:
Important
iPhone and iPad apps running on Macs with Apple silicon never receive connection notifications.
In the Xcode IDE I have the target set to Mac. Since I am running this on a Mac, and Xcode is set to a Mac target, I expected the result to be a Mac app running on a Mac, so I expected the Important notice above not to apply. Is my expectation correct? If not, what has gone wrong here?
Does a Mac Studio receive a connection notification when an iPhone 14 is connected?
Can EAAccessory methods be used on a Mac Studio to communicate with an iPhone 14?
On my Mac I made the mistake of reorganizing my development files. I renamed files, folders, and application names, and moved some folders around to improve the order and make the file structure easier to navigate. The result was a corrupted project whose errors I could not all fix. So I created a new project with the same name as the old one and copied the files into it. After all of the old project's files were copied into the new project, I got errors of the form:
Failed to register bundle identifier
The app identifier "com.example.com.Trial-iPhone" cannot be registered to your development team because it is not available. Change your bundle identifier to a unique string to try again.
No profiles for 'com.example.com.Trial-iPhone' were found
Xcode couldn't find any iOS App Development provisioning profiles matching 'com.IntOpSys.com.Bonjour-Trial-iPhone'.
(not the real domain name)
So I logged in to my developer account at developer.apple.com to delete this certificate, but could not find a way to navigate to where that is done. Is that doable? Where, or how?
I worked around the problem by appending a revision letter to the bundle identifier, so it is now com.example.com.Trial-iPhoneA. I would rather not have to do this when I already have an identifier to use, and I also do not like the idea of accumulating old, never-again-used certificates.
Is there an easy way to convert a storyboard made for iOS to one made for a Mac? Failing that, is there an easy way to convert a storyboard to SwiftUI?
On my Mac Studio there are applications that start up on login. I need them not to. No applications show up at:
System Settings => General => Login Items
Where else is there to look?
The startup apps that are opening are:
iPhone 14 Pro simulator
Activity Monitor
I believe the simulator is opening because I have been developing an iPhone 14 app with Xcode. But once started, somehow, like the Sorcerer's Apprentice's brooms, it just keeps reappearing on its own, even if I closed both it and Xcode before shutting down. So now, to keep this from running at startup, do I need a sorcerer?
I have seen how to connect a button in a storyboard view to the code that performs its action by dragging the button into the assistant editor.
Now I need to do the opposite. There is a button in the storyboard view that is already connected to code. How can I quickly look up the code that an existing button is already connected to?
The Storyboard Interface Builder has a Switch object that toggles between On and Off states. While in the On state it looks normal; while in the Off state it is grayed out.
I have a use for this object to let the user toggle between two options other than On and Off. For my intended use the grayed-out Off state is undesirable. Is there a way to prevent this switch from graying out in its Off state?
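The only workaround I can think of is to give the switch its own background color so the Off state is not plain gray. This is an untested sketch, and the corner-radius trick is just my assumption about how to make the background match the switch's shape:

let optionSwitch = UISwitch()
optionSwitch.onTintColor = .systemGreen       // track color in the On state
optionSwitch.backgroundColor = .systemBlue    // shows through the track in the Off state
optionSwitch.layer.cornerRadius = optionSwitch.bounds.height / 2
optionSwitch.clipsToBounds = true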
Here is a code snippet by which I attempted to set a neutral white balance:
func prepare(completionHandler: @escaping (Error?) -> Void) {
    func createCaptureSession() {
        self.captureSession = AVCaptureSession()
    }
    func configureCaptureDevices() throws {
        let session = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera],
                                                       mediaType: AVMediaType.video,
                                                       position: .back)
        let cameras = session.devices.compactMap { $0 }
        if cameras.isEmpty {
            throw CameraControllerError.noCamerasAvailable
        }
        do {
            for camera in cameras {
                try camera.lockForConfiguration()
                camera.setWhiteBalanceModeLocked(
                    with whiteBalanceGains: AVCaptureDevice.WhiteBalanceGains(redGain: 1.0, greenGain: 1.0, blueGain: 1.0),
                    completionHandler handler: ((CMTime) -> Void)? = nil
                ) // Errors: Extra arguments at positions #2, #3 in call; Cannot find 'with' in scope
                camera.whiteBalanceMode = .locked
                camera.unlockForConfiguration()
            }
        } catch {
            // To Do: Error handling here
        }
    }
My call to "AVCaptureDevice.WhiteBalanceGains()" gets errors. What is the correct way to do this?
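My best guess, which I have not been able to verify, is that with: and completionHandler: are meant to be used as plain argument labels rather than pasting the parameter declarations into the call; the 1.0 gains and the nil completion handler below are just placeholder values:

let gains = AVCaptureDevice.WhiteBalanceGains(redGain: 1.0, greenGain: 1.0, blueGain: 1.0)
if camera.isLockingWhiteBalanceWithCustomDeviceGainsSupported {
    camera.setWhiteBalanceModeLocked(with: gains, completionHandler: nil)
}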
The goal is to have the background and foreground colors show the current state of a UIButton in a storyboard file. The attempt to do that is this code:
var toggleLight: Bool = false

@IBAction func toggleLight(_ sender: UIButton) {
    if toggleLight {
        toggleLight = false
        sender.baseBackgroundColor = UIColor.black
        sender.baseForegroundColor = UIColor.white
    } else {
        toggleLight = true
        sender.baseBackgroundColor = UIColor.white
        sender.baseForegroundColor = UIColor.black
    }
    tableView.reloadData()
}
I get these errors:
Value of type 'UIButton' has no member 'baseBackgroundColor'
Value of type 'UIButton' has no member 'baseForegroundColor'
I do not understand this because in file "UIKit.UIButton" I see:
extension UIButton {
...
public var baseForegroundColor: UIColor?
public var baseBackgroundColor: UIColor?
...
}
What has gone wrong there?
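My guess, which I have not confirmed, is that baseBackgroundColor and baseForegroundColor belong to UIButton.Configuration (iOS 15 and later) rather than to UIButton itself, so the colors would have to be set through the button's configuration, roughly like this (the .filled() fallback is just my assumption for a button that was not given a configuration in the storyboard):

@IBAction func toggleLight(_ sender: UIButton) {
    toggleLight = !toggleLight
    // Assumption: the colors are set on the configuration, not on the button itself.
    var config = sender.configuration ?? .filled()
    config.baseBackgroundColor = toggleLight ? .white : .black
    config.baseForegroundColor = toggleLight ? .black : .white
    sender.configuration = config
    tableView.reloadData()
}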
I have a warning message that the method AVCapturePhotoOutput.dngPhotoDataRepresentation is deprecated. What has this method been superseded by?
I tried various ways to instantiate the class AVCapturePhotoSettings, as listed below:
let settings = AVCapturePhotoSettings(format: [kCVPixelBufferPixelFormatTypeKey: "BGRA"])
let settings = AVCapturePhotoSettings(format: [kCVPixelBufferPixelFormatTypeKey as String: "BGRA"])
let settings = AVCapturePhotoSettings(format: [String(kCVPixelBufferPixelFormatTypeKey): "BGRA"])
let settings = AVCapturePhotoSettings(format: ["kCVPixelBufferPixelFormatTypeKey": "BGRA"])
The first method gets the error:
Cannot convert value of type 'CFString' to expected dictionary key type 'String'
and so the other three attempts listed above were made, but each of them got the runtime error:
*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** +[AVCapturePhotoSettings photoSettingsWithFormat:] Either kCVPixelBufferPixelFormatTypeKey or AVVideoCodecKey must be specified'
What is the correct way to use this initializer for the class AVCapturePhotoSettings?
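My best guess, which I have not confirmed, is that the value must be one of the numeric CoreVideo pixel format constants rather than the string "BGRA", roughly:

let settings = AVCapturePhotoSettings(format: [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
])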
The need is to capture an image in an uncompressed format and save it in a lossless file format. I am not particular about which uncompressed format. However, AVCapturePhotoOutput.availablePhotoPixelFormatTypes shows that on my iPhone 14 Pro the BGRA format is available, and its type value is 1111970369. So I use this code to get a settings object and trigger a photo capture with it:
let settings = AVCapturePhotoSettings(format: [ String( kCVPixelBufferPixelFormatTypeKey ) : 1111970369] )
self.photoOutput?.capturePhoto(with: settings, delegate: self )
I am not sure I have the AVCapturePhotoSettings() dictionary argument right. It is the best I could conclude from the available documentation, and it does not produce a compile-time error. I found no example of how to produce uncompressed photo files; I found only one that produces JPEGs, and I have successfully used that example to capture images in JPEG format. This is the URL for that example, which I am now attempting to modify to produce uncompressed image files: https://www.appcoda.com/avfoundation-swift-guide/ . The example is somewhat out of date, and some method names had to be changed accordingly.
Next, when the system calls the delegated callback:
func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?)
I need to convert the passed photo.pixelBuffer image object into one I can pass to:
self.photoCaptureCompletionBlock?(image, nil)
The UIImage(data:) initializer cannot do this alone; I get a compile-time error when I pass photo.pixelBuffer to it. I tried to convert with the AVCapturePhotoOutput() class, but could not find a version of its initializers the compiler would accept for this.
What do I need to do to get the data in photo.pixelBuffer stored in an uncompressed file?
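Here is a sketch of the direction I am considering (unverified): wrap the CVPixelBuffer in a CIImage, render it to a CGImage, and save the result losslessly as a PNG. The photoCaptureCompletionBlock property and the delegate method shape come from the AppCoda example above; the CIContext/CIImage conversion and the temporary-file path are my own assumptions.

func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {
    guard error == nil, let pixelBuffer = photo.pixelBuffer else {
        self.photoCaptureCompletionBlock?(nil, error)
        return
    }
    // Wrap the raw pixel buffer and render it into a CGImage.
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
        self.photoCaptureCompletionBlock?(nil, nil)
        return
    }
    let image = UIImage(cgImage: cgImage)
    // PNG is lossless; write it to the temporary directory for now.
    if let pngData = image.pngData() {
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent("capture.png")
        try? pngData.write(to: url)
    }
    self.photoCaptureCompletionBlock?(image, nil)
}

Would that be a reasonable way to do it, or is there a more direct route from photo.pixelBuffer to a lossless file?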
The need is to control the cameras and LIDAR on up to four iPhone 14 Pros from the Mac Studio they are connected to. The data from the iPhones is to be uploaded to the Mac Studio for processing. What I need help with is:
1. How can the iPhone 14s connected to the Mac Studio be enumerated on the Mac Studio, to know how many there are and to get their communication handles?
2. How can the camera application that takes the photos, and the other that takes LIDAR data, be communicated with over the Lightning cables to send commands from the Mac Studio and to transfer the image and LIDAR data files?
3. The iPhones will have to route commands sent over the Lightning cables from the Mac Studio to the appropriate application for camera and LIDAR. What is the best way to do this routing?
To save time, I will base the iPhone software on these sample projects, which I will modify to meet the project's needs:
https://developer.apple.com/documentation/avfoundation/capture_setup/avcam_building_a_camera_app
https://developer.apple.com/documentation/avfoundation/additional_data_capture/capturing_depth_using_the_lidar_camera
Parameters I need to directly control from the Mac Studio are exposure times, f-stops, and which of the iPhones will flash. Each connected iPhone's camera must be triggered as simultaneously as possible.
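Below is a sketch of the iPhone-side exposure control I have in mind, assuming the AVCaptureDevice custom exposure API (duration and ISO); it does not cover f-stop or flash selection, and the ISO clamping is just my assumption about what the API requires:

func setExposure(on device: AVCaptureDevice, seconds: Double, iso: Float) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }
    // Clamp the requested ISO to what the active format supports.
    let clampedISO = min(max(iso, device.activeFormat.minISO), device.activeFormat.maxISO)
    // Note: the duration should also fall within the active format's min/max exposure durations.
    let duration = CMTimeMakeWithSeconds(seconds, preferredTimescale: 1_000_000)
    device.setExposureModeCustom(duration: duration, iso: clampedISO, completionHandler: nil)
}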
I am very new to Swift and macOS. I have read the Swift manual and done the iOS App Dev Tutorials.
In this thread, eskimo posted code containing the line:
listener.service = .init(type: "_ssh._tcp")
I see in this document that the type parameter is a service string, whose first substring identifies the application protocol and whose second identifies the transport protocol.
Where are the valid string options for these application and transport protocols in an NWListener service type documented?
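For reference, here is a minimal sketch of the kind of listener setup I am asking about; the "_myapp._tcp" type and the "MyDevice" name are placeholders of my own, following the "_<application>._<transport>" convention from Bonjour / DNS-SD:

import Network

// Minimal sketch: advertise a Bonjour service and accept connections.
let listener = try NWListener(using: .tcp)
listener.service = NWListener.Service(name: "MyDevice", type: "_myapp._tcp")
listener.stateUpdateHandler = { state in
    print("listener state: \(state)")
}
listener.newConnectionHandler = { connection in
    connection.start(queue: .main)
}
listener.start(queue: .main)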
These are download links to zip files containing Xcode 14.3 project files, which are my attempt to establish USB communications with an iPhone 14 connected by Lightning cable to a Mac Studio:
https://www.mediafire.com/file/k3my6y94iyjobeq/Bonjour-Trial-iPhone.zip/file
https://www.mediafire.com/file/cof3b3w9tru1jd0/Bonjour-Trial-Mac-Enumeration.zip/file
I could not attach them here. This web interface's file browser had these files grayed out, so it was necessary to make them available on a file sharing site.
The Bonjour-Trial-iPhone.zip project runs on the iPhone 14. The Bonjour-Trial-Mac-Enumeration.zip project runs on the Mac Studio.
I have the iPhone 14 project code working on the iPhone, and I expect it to be advertising its presence.
The Mac Studio project runs, but does not find the iPhone 14. When I trace execution in file Bonjour-Trial-Mac-Enumeration.swift, the "results" array is empty on line 65.
What is going wrong here? Is a type "_ssh._tcp" connection possible between a Mac Studio and an iPhone over the Lightning cable?
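For reference, the browsing I am attempting on the Mac side is essentially this pattern (a simplified sketch, not the exact project code; the "_ssh._tcp" type matches the listener line quoted earlier):

import Network

// Minimal sketch: browse for Bonjour services of the given type.
let browser = NWBrowser(for: .bonjour(type: "_ssh._tcp", domain: nil), using: NWParameters())
browser.stateUpdateHandler = { state in
    print("browser state: \(state)")
}
browser.browseResultsChangedHandler = { results, changes in
    // In my runs this "results" set stays empty.
    for result in results {
        print("found endpoint: \(result.endpoint)")
    }
}
browser.start(queue: .main)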
The Info.plist file is missing from my project as shown in this screenshot:
Yet the project still has its properties, as the screenshot shows. So the properties are obviously being remembered somewhere, and I do not know where. As can be seen, no path to an Info.plist is currently set.
I need to create a new Info.plist file and add it to the project. My concern now is about synchronization. If I were to do this by navigating to File => New => File => Resource => Property List and named it Info.plist, would this new property list file automatically be synchronized with all of my current settings? Or would it contain only default settings, and synchronize my project's settings to those defaults?
I suspect the original Info.plist file was lost when I decided to rename folders and filenames and had to copy everything into a new project to do that; I failed to copy over the Info.plist file.