I am trying to make an Image Classifier, but I keep getting the warning: "'init()' is deprecated: Use init(configuration:) instead and handle errors appropriately." I was wondering if it matters, because the app still builds and the classifier works. Just wondering what the warning means.
Thanks in advance!
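For reference, the warning means the generated model class's plain `init()` can fail silently, while the replacement `init(configuration:)` is a throwing initializer that surfaces load failures as errors. A minimal sketch of the non-deprecated form (assuming a generated model class named `MyClassifier` — the class name is a placeholder):

```swift
import CoreML
import Vision

// The generated model class exposes a throwing initializer that takes an
// MLModelConfiguration, so a failed model load becomes a catchable error
// instead of a silent failure.
func makeClassificationRequest() throws -> VNCoreMLRequest {
    let config = MLModelConfiguration()
    config.computeUnits = .all // optional: CPU, GPU, and/or Neural Engine

    // `MyClassifier` is a placeholder for your generated model class.
    let coreMLModel = try MyClassifier(configuration: config).model
    let visionModel = try VNCoreMLModel(for: coreMLModel)
    return VNCoreMLRequest(model: visionModel)
}
```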
Hello, I have an object detection model that I integrated into an app. When I put an image on the preview for the Object Detection File, it classifies the image correctly. However, if I put the same image onto the app, it classifies it differently with different values. I am confused as to how this is happening. Here is my code:
import UIKit
import CoreML
import Vision
import ImageIO

class SecondViewController: UIViewController, UINavigationControllerDelegate {
    @IBOutlet weak var photoImageView: UIImageView!
    @IBOutlet weak var results: UILabel!

    lazy var detectionRequest: VNCoreMLRequest = {
        do {
            // init(configuration:) avoids the deprecated init() warning and throws on failure.
            let model = try VNCoreMLModel(for: EarDetection2(configuration: MLModelConfiguration()).model)
            let request = VNCoreMLRequest(model: model, completionHandler: { [weak self] request, error in
                self?.processDetections(for: request, error: error)
            })
            request.imageCropAndScaleOption = .scaleFit
            return request
        } catch {
            fatalError("Failed to load Vision ML model: \(error)")
        }
    }()

    @IBAction func testPhoto(_ sender: UIButton) {
        let vc = UIImagePickerController()
        vc.sourceType = .photoLibrary
        vc.delegate = self
        present(vc, animated: true)
    }

    func updateDetections(for image: UIImage) {
        // Map the UIImage orientation to the CGImagePropertyOrientation Vision expects,
        // falling back to .up instead of force-unwrapping.
        let orientation = CGImagePropertyOrientation(rawValue: UInt32(image.imageOrientation.rawValue)) ?? .up
        guard let ciImage = CIImage(image: image) else {
            print("Unable to create \(CIImage.self) from \(image).")
            return
        }
        DispatchQueue.global(qos: .userInitiated).async {
            let handler = VNImageRequestHandler(ciImage: ciImage, orientation: orientation)
            do {
                try handler.perform([self.detectionRequest])
            } catch {
                print("Failed to perform detection.\n\(error.localizedDescription)")
            }
        }
    }

    func processDetections(for request: VNRequest, error: Error?) {
        DispatchQueue.main.async {
            // Avoid the force cast: bail out cleanly if the results are missing
            // or not object-detection observations.
            guard let detections = request.results as? [VNRecognizedObjectObservation] else {
                print("Unable to detect anything.\n\(error?.localizedDescription ?? "unknown error")")
                return
            }
            self.drawDetectionsOnPreview(detections: detections)
        }
    }

    func drawDetectionsOnPreview(detections: [VNRecognizedObjectObservation]) {
        guard let image = self.photoImageView?.image else { return }
        let imageSize = image.size
        let scale: CGFloat = 0 // 0 = use the device's screen scale
        UIGraphicsBeginImageContextWithOptions(imageSize, false, scale)
        // Draw the base image once, before overlaying the detection rectangles.
        image.draw(at: .zero)
        for detection in detections {
            let labelText = detection.labels
                .map { "\($0.identifier) confidence: \($0.confidence)" }
                .joined(separator: "\n")
            print(labelText)
            print("------------")
            results.text = labelText
            // Bounding boxes are normalized to the processed image's dimensions,
            // with the origin at the lower-left corner, so flip the y-axis.
            let boundingBox = detection.boundingBox
            let rectangle = CGRect(x: boundingBox.minX * imageSize.width,
                                   y: (1 - boundingBox.minY - boundingBox.height) * imageSize.height,
                                   width: boundingBox.width * imageSize.width,
                                   height: boundingBox.height * imageSize.height)
            UIColor(red: 0, green: 1, blue: 0, alpha: 0.4).setFill()
            UIRectFillUsingBlendMode(rectangle, .normal)
        }
        let newImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        self.photoImageView?.image = newImage
    }
}

extension SecondViewController: UIImagePickerControllerDelegate {
    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        picker.dismiss(animated: true)
        guard let image = info[.originalImage] as? UIImage else { return }
        self.photoImageView?.image = image
        updateDetections(for: image)
    }
}
I attached pictures of the model preview and the app preview (it may be hard to tell but they are the same image). I have also attached pictures of my files and storyboard.
Any help would be great!
Thanks in advance!
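One thing worth checking (an assumption, not something confirmed in the post): the Xcode model preview and a `VNCoreMLRequest` may pre-process the image differently before inference. The request's `imageCropAndScaleOption` controls how Vision resizes and crops the input to the model's expected dimensions, and a different option can produce noticeably different confidences for the same image. A sketch of running the same handler with each option to see whether this explains the mismatch (`compareCropAndScaleOptions` is a hypothetical helper name):

```swift
import Vision

// Hypothetical helper: run the same request once per crop/scale option and
// print how many observations each produces. `.scaleFit` letterboxes the
// image, `.scaleFill` stretches it, and `.centerCrop` crops a center square
// before scaling — each changes the pixels the model actually sees.
func compareCropAndScaleOptions(request: VNCoreMLRequest,
                                handler: VNImageRequestHandler) {
    let options: [VNImageCropAndScaleOption] = [.centerCrop, .scaleFit, .scaleFill]
    for option in options {
        request.imageCropAndScaleOption = option
        do {
            try handler.perform([request])
            print("Option \(option.rawValue): \(request.results?.count ?? 0) observations")
        } catch {
            print("Option \(option.rawValue) failed: \(error)")
        }
    }
}
```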
I have been trying to figure out how to connect an iPhone to an external camera over Wi-Fi. The phone connects to the camera over Wi-Fi, but I am confused about how to display the camera's video in a custom app built in Xcode. Is this even possible? Any help would be appreciated. I essentially want to stream the video and take pictures in the custom app.
Topic:
Developer Tools & Services
SubTopic:
Xcode
Tags:
External Accessory
Xcode
Camera
AVFoundation
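Whether this works depends entirely on which protocol the camera exposes, which the post does not state. If the camera happens to serve a standard HTTP Live Streaming (HLS) playlist, a minimal sketch with AVFoundation might look like the following; the URL is a placeholder, and a raw RTSP or MJPEG stream would instead need a third-party library or manual decoding:

```swift
import UIKit
import AVFoundation

class StreamViewController: UIViewController {
    var player: AVPlayer?
    var playerLayer: AVPlayerLayer?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Placeholder URL: replace with the address your camera actually serves.
        // AVPlayer handles HLS (.m3u8) natively; it does not speak RTSP.
        guard let url = URL(string: "http://192.168.1.100/stream.m3u8") else { return }
        player = AVPlayer(url: url)
        let layer = AVPlayerLayer(player: player)
        layer.frame = view.bounds
        layer.videoGravity = .resizeAspect
        view.layer.addSublayer(layer)
        playerLayer = layer
        player?.play()
    }
}
```

For still captures, one approach is to grab a frame from the playing stream (e.g. via `AVPlayerItemVideoOutput`) rather than asking the camera for a separate photo endpoint, unless the camera's own API documents one.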