Good afternoon!
Can you advise me? I want to set the photo exposure by tapping a point on the photo preview. At the moment I use a DragGesture to get a CGPoint and pass it to the capture setup:
let device = self.videoDeviceInput.device
do {
    try device.lockForConfiguration()
    // Check the exposure capability (not focus), set the point of
    // interest before changing the mode, and always unlock the device.
    if device.isExposurePointOfInterestSupported {
        device.exposurePointOfInterest = focusPoint
        device.exposureMode = .autoExpose
    }
    device.unlockForConfiguration()
} catch {
    print("Could not lock device for configuration: \(error)")
}
The coordinates are printed to the console, but in the preview the exposure seems to lock on a point closer to the bottom edge than where I tapped.
The code for the view is:
.gesture(
    DragGesture(minimumDistance: 0)
        .onChanged { value in
            self.expFactor = value.location
            print(expFactor)
        }
        .onEnded { value in
            model.exp(with: expFactor)
        }
)
Has anyone already implemented exposure locking like this in SwiftUI? I want it to behave roughly like the standard Camera app.
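The offset you describe usually comes from a coordinate-space mismatch: exposurePointOfInterest expects a normalized (0...1) point in the capture device's coordinate space, not a point in view coordinates. A minimal sketch of the conversion, assuming the preview is backed by an AVCaptureVideoPreviewLayer (the names previewLayer and setExposure are illustrative, not from the original post):

import AVFoundation

func setExposure(at viewPoint: CGPoint,
                 previewLayer: AVCaptureVideoPreviewLayer,
                 device: AVCaptureDevice) {
    // Convert from layer/view coordinates to the device's normalized
    // space; this accounts for rotation, mirroring, and videoGravity.
    let devicePoint = previewLayer.captureDevicePointConverted(fromLayerPoint: viewPoint)
    do {
        try device.lockForConfiguration()
        if device.isExposurePointOfInterestSupported {
            device.exposurePointOfInterest = devicePoint
            device.exposureMode = .autoExpose
        }
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}

Passing the raw DragGesture location straight to the device would explain the tap appearing to land lower than expected, since the two coordinate systems have different origins and scales.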
Good afternoon!
Can you tell me how to calculate the user's maximum speed? Do you have an example implementation of such a function?
Also, how can I calculate the distance traveled? Maybe someone has an example implementation.
My application is a speedometer that uses only Core Location.
Thank you!
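A minimal sketch of both calculations with plain Core Location, assuming a CLLocationManager delegate; the names SpeedTracker, maxSpeed, and totalDistance are illustrative:

import CoreLocation

final class SpeedTracker: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private var lastLocation: CLLocation?

    private(set) var maxSpeed: CLLocationSpeed = 0          // meters per second
    private(set) var totalDistance: CLLocationDistance = 0  // meters

    override init() {
        super.init()
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        for location in locations {
            // CLLocation.speed is negative when it cannot be
            // determined, so ignore those samples.
            if location.speed >= 0 {
                maxSpeed = max(maxSpeed, location.speed)
            }
            // Accumulate the distance between consecutive fixes.
            if let last = lastLocation {
                totalDistance += location.distance(from: last)
            }
            lastLocation = location
        }
    }
}

In practice you may also want to discard fixes with poor horizontalAccuracy before accumulating distance, otherwise GPS jitter while standing still inflates the total.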
Good afternoon!
Can you tell me how I can solve the following problem?
My app has 4 background color themes (like the standard Books app). By default the status bar follows the system color scheme, which doesn't work here: a white theme can end up with a white status bar, and the same happens in reverse with a black theme.
At this point, the best I could come up with and find is:
.onChange(of: store.colorStatusBar) { newValue in
    UIApplication.shared.statusBarStyle = store.colorStatusBar
}
But Xcode warns that this API is out of date:
'statusBarStyle' was deprecated in iOS 13.0: Use the statusBarManager property of the window scene instead.
I haven't found a working example with statusBarManager.
Can you tell me how to rewrite or fix this?
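Since iOS 13 the status bar style is driven by the visible view controller rather than set globally, so one common workaround is a UIHostingController subclass that overrides preferredStatusBarStyle. A minimal sketch, assuming you control the window's root view controller (ThemedHostingController and statusBarStyle are illustrative names, not an existing API):

import SwiftUI
import UIKit

final class ThemedHostingController<Content: View>: UIHostingController<Content> {
    // Update this property whenever the theme changes; the didSet
    // asks UIKit to re-query preferredStatusBarStyle.
    var statusBarStyle: UIStatusBarStyle = .default {
        didSet { setNeedsStatusBarAppearanceUpdate() }
    }

    override var preferredStatusBarStyle: UIStatusBarStyle {
        statusBarStyle
    }
}

You would install this as the window's rootViewController in the scene delegate and set statusBarStyle to .lightContent or .darkContent when your store's theme changes; this avoids the deprecated UIApplication.shared.statusBarStyle entirely.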