Post

Replies

Boosts

Views

Activity

How to get real-time camera frames in SwiftUI?
I want to get real-time camera frames so I can apply machine learning in SwiftUI. I made a camera app with SwiftUI like this, but I don't know how to get the camera frames or how to apply machine learning techniques to them.

```swift
struct ImagePicker: UIViewControllerRepresentable {
    var sourceType: UIImagePickerController.SourceType = .camera

    func makeUIViewController(context: UIViewControllerRepresentableContext<ImagePicker>) -> UIImagePickerController {
        let imagePicker = UIImagePickerController()
        imagePicker.allowsEditing = false
        imagePicker.sourceType = sourceType
        return imagePicker
    }

    func updateUIViewController(_ uiViewController: UIImagePickerController, context: UIViewControllerRepresentableContext<ImagePicker>) {
    }
}
```
0
0
918
Jun ’22
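`UIImagePickerController` does not expose a live frame stream. The usual approach is an `AVCaptureSession` with an `AVCaptureVideoDataOutput` whose sample-buffer delegate receives each frame, which you can then hand to Vision/Core ML. A minimal sketch follows; the type name `FrameGrabber` and the commented Vision call are illustrative assumptions, not part of the original post.

```swift
import AVFoundation

/// Hypothetical helper: captures live camera frames and exposes each one
/// to ML processing via the sample-buffer delegate callback.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let output = AVCaptureVideoDataOutput()

    override init() {
        super.init()
        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device) else { return }
        session.addInput(input)
        // Deliver frames on a dedicated serial queue.
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
        output.alwaysDiscardsLateVideoFrames = true
        session.addOutput(output)
    }

    // Called once per captured frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Feed the frame to Vision / Core ML here, e.g.:
        // let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer)
        // try? handler.perform([someVNCoreMLRequest])
        _ = pixelBuffer
    }
}
```

To show the preview in SwiftUI, one option is a `UIViewRepresentable` whose `UIView` hosts an `AVCaptureVideoPreviewLayer` bound to the same session. Note the delegate method is `didOutput`, not `didDrop`; the latter is only called for frames that were skipped.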
Can I participate in the Swift Student Challenge as a soldier?
I'm sorry, I don't speak English well; I'm using a translator, so my sentences may be awkward. I am a Korean college student who likes Swift. Most Korean men serve in the military, so next year I will take a leave of absence from university and become one of them. However, I would like to participate in the annual WWDC Swift Student Challenge. Is it possible for me to participate and win? If I take a leave of absence and participate as a soldier, am I ineligible to receive the award?
0
0
996
Jun ’22
Can I participate in the Swift Student Challenge as a soldier?
I'm sorry, I don't speak English well; I'm using a translator, so my sentences may be awkward. I am a Korean college student who likes Swift. Most Korean men serve in the military, so next year I will take a leave of absence from university and become one of them. However, I would like to participate in the annual WWDC Swift Student Challenge. Is it possible for me to participate and win? If I take a leave of absence and participate as a soldier, am I ineligible to receive the award?
1
0
1.2k
Jun ’22
I wonder why operator overloading works like this.
I wonder how the `+` function is invoked directly.

```swift
extension CGPoint {
    static func add(lhs: Self, rhs: Self) -> CGPoint {
        CGPoint(x: lhs.x + rhs.x, y: lhs.y + rhs.y)
    }

    static func +(lhs: Self, rhs: Self) -> CGPoint {
        CGPoint(x: lhs.x + rhs.x, y: lhs.y + rhs.y)
    }
}

private func testCGPoint() {
    let dummy1 = CGPoint(x: 2, y: 3)
    let dummy2 = CGPoint(x: 4, y: 5)
    let dummy3: CGPoint = .add(lhs: dummy1, rhs: dummy2)
    let dummy4: CGPoint = dummy1 + dummy2
}
```

I expected the `+` function to be called like the `add` function, i.e. in this form: `CGPoint.+(lhs: dummy1, rhs: dummy2)`. But the `+` function is not called that way. How does Swift do this?
1
0
1k
Jul ’22
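The short answer: an operator function is declared as a `static func`, but Swift's grammar only lets you *call* it with infix syntax, so `CGPoint.+(lhs:rhs:)` is not valid call syntax. You can, however, capture the operator as a plain function value. A self-contained sketch, using an illustrative `Vec` struct in place of `CGPoint`:

```swift
// Minimal sketch: a named static method vs. an operator function.
// `Vec` is a hypothetical stand-in for CGPoint.
struct Vec: Equatable {
    var x: Double
    var y: Double

    static func add(lhs: Self, rhs: Self) -> Vec {
        Vec(x: lhs.x + rhs.x, y: lhs.y + rhs.y)
    }

    static func + (lhs: Self, rhs: Self) -> Vec {
        Vec(x: lhs.x + rhs.x, y: lhs.y + rhs.y)
    }
}

let a = Vec(x: 2, y: 3)
let b = Vec(x: 4, y: 5)

// Named method: ordinary dot-call syntax.
let sum1 = Vec.add(lhs: a, rhs: b)

// Operator: infix syntax only. `Vec.+(lhs: a, rhs: b)` is a syntax error.
let sum2 = a + b

// But the operator can still be taken as a function value:
let plus: (Vec, Vec) -> Vec = (+)
let sum3 = plus(a, b)

print(sum1 == sum2 && sum2 == sum3)  // true
```

The compiler resolves `a + b` by overload resolution on the operand types, which is why the same declaration serves both infix use and function-reference use.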
How to get assets used in ARKit?
I am a student studying AR with ARKit. I want a variety of assets that can be used commercially (there are limits to what I can make myself...). Is there any place where I can get assets (3D models, USDZ files, etc.), similar to Unity's Asset Store? The resources on the Apple developer homepage are good but few, and I don't know whether they are commercially usable.
1
0
1.3k
Dec ’22
How to use the back camera with AVCapture?
I use this code in my app:

```swift
// MARK: - Protocol

protocol VideoManagerProtocol: AnyObject {
    func didReceive(sampleBuffer: CMSampleBuffer)
}

final class VideoManager: NSObject {

    // MARK: -- Properties

    /// RequestPermissionCompletionHandler
    typealias RequestPermissionCompletionHandler = ((_ accessGranted: Bool) -> Void)

    /// delegate of VideoManager
    weak var delegate: VideoManagerProtocol?

    /// An object that manages capture activity.
    let captureSession = AVCaptureSession()

    /// A device that provides input (such as audio or video) for capture sessions.
//  let videoDevice = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front) // choosing deviceType and position is better, but I don't know the iPad cameras, so I chose default to be safe
    let videoDevice = AVCaptureDevice.default(for: .video)

    /// A Core Animation layer that displays the video as it's captured.
    lazy var videoLayer: AVCaptureVideoPreviewLayer = {
        return AVCaptureVideoPreviewLayer(session: captureSession)
    }()

    /// A capture output that records video and provides access to video frames for processing.
    lazy var videoOutput: AVCaptureVideoDataOutput = {
        let output = AVCaptureVideoDataOutput()
        let queue = DispatchQueue(label: "VideoOutput", attributes: .concurrent, autoreleaseFrequency: .inherit)
        output.setSampleBufferDelegate(self, queue: queue)
        output.alwaysDiscardsLateVideoFrames = true
        return output
    }()

    // MARK: -- Methods

    override init() {
        guard let videoDevice = videoDevice,
              let videoInput = try? AVCaptureDeviceInput(device: videoDevice) else {
            fatalError("No `Video Device` detected!")
        }

        super.init()

        captureSession.addInput(videoInput)
        captureSession.addOutput(videoOutput)
    }

    func startVideoCapturing() {
        self.captureSession.startRunning()
        // do not operate in dispatch global background
//    DispatchQueue.global(qos: .background).async {
//      self.captureSession.startRunning()
//    }
    }

    func stopVideoCapturing() {
        captureSession.stopRunning()
    }

    func requestPermission(completion: @escaping RequestPermissionCompletionHandler) {
        AVCaptureDevice.requestAccess(for: .video) { accessGranted in
            completion(accessGranted)
        }
    }
}

// MARK: - AVCaptureVideoDataOutputSampleBufferDelegate

extension VideoManager: AVCaptureVideoDataOutputSampleBufferDelegate {

    func captureOutput(_ output: AVCaptureOutput, didDrop sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        delegate?.didReceive(sampleBuffer: sampleBuffer)
    }
}
```

I tried `AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)`, but the camera still uses only the front camera. How can I use the back camera?
0
0
1.6k
Jan ’23
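Changing `videoDevice` alone is not enough: the session keeps whatever `AVCaptureDeviceInput` was added at init time, so the back-camera device must be wrapped in a new input and the old input removed, inside a configuration block. A hedged sketch of that switch; the function name and structure are illustrative, not from the post:

```swift
import AVFoundation

/// Illustrative sketch (not the poster's final code): select the back camera
/// explicitly and swap the session's video input inside a configuration block.
func switchToBackCamera(session: AVCaptureSession) {
    // `.builtInWideAngleCamera` with `position: .back` selects the rear camera;
    // `AVCaptureDevice.default(for: .video)` ignores position entirely.
    guard let backDevice = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .back),
          let backInput = try? AVCaptureDeviceInput(device: backDevice) else {
        return
    }

    session.beginConfiguration()
    // Remove the existing camera input first; otherwise addInput fails
    // because the session already has a video input.
    for input in session.inputs {
        session.removeInput(input)
    }
    if session.canAddInput(backInput) {
        session.addInput(backInput)
    }
    session.commitConfiguration()
}
```

Separately, note that the delegate method in the post is `captureOutput(_:didDrop:from:)`, which fires only for dropped frames; normal frames arrive via `captureOutput(_:didOutput:from:)`.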