I have an image with a pixel size of 3024 × 3024.
I have a custom UIView with a logical size of 207 × 207; layer.contentsScale is 3, so its pixel size is 621 × 621.
I want to draw this UIImage in my custom UIView. The code in draw(_:) looks like this:
var image: UIImage? {
    didSet {
        setNeedsDisplay()   // redraw whenever the image changes
    }
}

override func draw(_ rect: CGRect) {
    guard let ctx = UIGraphicsGetCurrentContext() else { return }

    // Fill the background with black.
    ctx.addRect(bounds)
    ctx.setFillColor(UIColor.black.cgColor)
    ctx.fillPath()

    if let im = image {
        // Fit the image inside bounds while preserving its aspect ratio.
        let size = Math.getMaxSizeWithAspect(size: CGSize(width: bounds.width, height: bounds.height),
                                             radioWidthToHeight: im.size.width / im.size.height)
        im.draw(in: CGRect(x: (bounds.width - size.width) / 2,
                           y: (bounds.height - size.height) / 2,
                           width: size.width,
                           height: size.height))
    }
}
The result is very bad: the picture is aliased (the image is being downscaled by roughly 4.9×, from 3024 px down to at most 621 px). I tried many solutions, but none worked well.
// Things I tried that did not work:
ctx.setShouldAntialias(true)
ctx.setAllowsAntialiasing(true)
ctx.interpolationQuality = .high
layer.allowsEdgeAntialiasing = true
layer.minificationFilter = .trilinear
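For context, here is a minimal sketch of the pre-scaling approach I am considering instead of drawing the full-resolution image directly in draw(_:). This is not my current code; the helper name and the use of UIGraphicsImageRenderer are assumptions. The idea is that the expensive downsampling happens once, off-screen, and draw(_:) only blits an already-resampled image.

// Sketch: downscale the large UIImage once to the size it will occupy in the view.
// Intended as a method on the custom UIView (needs import UIKit).
func prescaled(_ image: UIImage, to targetSize: CGSize) -> UIImage {
    let format = UIGraphicsImageRendererFormat()
    format.scale = layer.contentsScale          // 3 on my device
    let renderer = UIGraphicsImageRenderer(size: targetSize, format: format)
    return renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: targetSize))
    }
}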
I am using AVCaptureVideoDataOutput to capture each frame from the camera.
I display these frames in an MTKView, and that works fine.
However, when I call AVCapturePhotoOutput.capturePhoto(with:delegate:), the video data output's delegate callback captureOutput(_:didOutput:from:) stops being called for a moment, so my preview freezes briefly, which looks very bad.
I don't know whether I am doing something wrong that causes this.
This is my capturePhoto code:
func createTakePhotoSettingForNormalPhoto() -> AVCapturePhotoSettings {
    let settings = AVCapturePhotoSettings()
    settings.isHighResolutionPhotoEnabled = true
    return settings
}

func takePhoto() {
    let settings = createTakePhotoSettingForNormalPhoto()
    settings.flashMode = ...
    _outputStillImage.capturePhoto(with: settings, delegate: self)
}
I noticed that if I tap the shutter button repeatedly and quickly in the native iOS Camera app, there is a black flash at each shot, but the preview still displays frames smoothly.
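For reference, this is roughly how my capture session is configured. It is a simplified sketch, not the exact code; session and videoQueue are placeholder property names, and _outputStillImage is the AVCapturePhotoOutput used in takePhoto() above.

func configureSession() {
    session.beginConfiguration()
    session.sessionPreset = .photo

    // Video frames for the MTKView preview.
    let videoOutput = AVCaptureVideoDataOutput()
    videoOutput.setSampleBufferDelegate(self, queue: videoQueue)
    if session.canAddOutput(videoOutput) {
        session.addOutput(videoOutput)
    }

    // Still photo capture.
    _outputStillImage = AVCapturePhotoOutput()
    _outputStillImage.isHighResolutionCaptureEnabled = true
    if session.canAddOutput(_outputStillImage) {
        session.addOutput(_outputStillImage)
    }

    session.commitConfiguration()
}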
I want to use a Metal kernel function to calculate the histogram of a video frame (texture).
Some of the code in my kernel function looks like this:
float gray = grayScale(rgbValue);
int histogramIndex = int(clamp(gray, 0.0, 1.0) * 255);
histogram[histogramIndex] += 1;
It runs, but there is a mistake: when I sum every bin in the histogram array, the total does not equal my texture's pixel count; it is always less than the total pixel count. So I think there must be a multi-thread synchronization problem (multiple threads incrementing the same bin at the same time, so some increments are lost).
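This is roughly how I check the total on the CPU side (a sketch; histogramBuffer and texture are placeholder names for my MTLBuffer and source MTLTexture):

// Read back the 256 bins and compare their sum with the pixel count.
let bins = histogramBuffer.contents().bindMemory(to: UInt32.self, capacity: 256)
let total = (0..<256).reduce(0) { $0 + Int(bins[$1]) }
print("histogram sum: \(total), expected: \(texture.width * texture.height)")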
How can I synchronize these writes in the kernel function?
I know MPS provides MPSImageHistogram to do this, and it works well; I just want to calculate the histogram inside my own pipeline.
Thanks
I have a RAW image loaded from a file as a CGImage; its pixel format is 14-bit Bayer RGGB (kCVPixelFormatType_14Bayer_RGGB, 1919379252).
Then I convert it to an MTLTexture with pixel format .rgba16Uint.
Then I modify some pixel colors and write the result to another .rgba16Uint MTLTexture.
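For reference, this is roughly how I create the output texture (a sketch; device, width, and height are placeholders for my MTLDevice and the image dimensions):

let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .rgba16Uint,
                                                    width: width,
                                                    height: height,
                                                    mipmapped: false)
desc.usage = [.shaderRead, .shaderWrite]   // written by my kernel, read afterwards
let outputTexture = device.makeTexture(descriptor: desc)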
Then I want to create a CIImage from the output texture, but an error is raised telling me that creating the CIImage failed: unsupported texture format.
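The failing step is essentially this (a sketch; outputTexture is the second .rgba16Uint texture, and the colorSpace option is an assumption about how I tag it):

import CoreImage

if let ciImage = CIImage(mtlTexture: outputTexture,
                         options: [.colorSpace: CGColorSpaceCreateDeviceRGB()]) {
    // ... continue with the CIImage ...
} else {
    // This is where creation fails with "unsupported texture format".
}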