iOS 15 Vision person segmentation
I tried the sample code "Applying Matte Effects to People in Images and Videos" on an iPhone 12 mini, but the segmentation is not accurate near boundaries, especially hair. I even tried the .accurate segmentation quality level, which quickly overheats the phone, yet the result is still not good enough for live video. One thing that may matter: plain segmentation results are not as good as matting, which produces an alpha channel so hair blends accurately with the background. But if I am missing something, please do point it out.
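For reference, this is roughly how I'm configuring the request (a minimal sketch based on the sample; framePixelBuffer is a stand-in for whatever buffer you're processing):

```swift
import Vision
import CoreVideo

// Minimal sketch of the segmentation setup from the sample code.
func personMask(for framePixelBuffer: CVPixelBuffer) throws -> CVPixelBuffer? {
    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .accurate                      // .balanced/.fast overheat less but are coarser
    request.outputPixelFormat = kCVPixelFormatType_OneComponent8

    let handler = VNImageRequestHandler(cvPixelBuffer: framePixelBuffer, options: [:])
    try handler.perform([request])
    return request.results?.first?.pixelBuffer            // hard-edged mask, no hair-level alpha
}
```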
0 replies · 0 boosts · 929 views · Jun ’21
Xcode 12.5 abort trap 6 on building (after downgrading)
I had a project that was created in Xcode 12. I then opened it in Xcode 13 beta 5 and made a lot of edits. Now it does not build at all in Xcode 12.5; even a clean build fails with Command CompileSwift failed with a nonzero exit code. Here is the compiler crash trace:

```
1. Apple Swift version 5.4 (swiftlang-1205.0.26.9 clang-1205.0.19.55)
2. Running pass 'Module Verifier' on function '@"xxxxxxxxxxxxxxxxH0OSo014AVAsynchronousA18CompositionRequestCtF"'
0  swift-frontend           0x000000010796fe85 llvm::sys::PrintStackTrace(llvm::raw_ostream&) + 37
1  swift-frontend           0x000000010796ee78 llvm::sys::RunSignalHandlers() + 248
2  swift-frontend           0x0000000107970446 SignalHandler(int) + 262
3  libsystem_platform.dylib 0x00007fff204fbd7d _sigtramp + 29
4  libdyld.dylib            0x00007fff204d0ce8 _dyld_fast_stub_entry(void*, long) + 65
5  libsystem_c.dylib        0x00007fff2040b406 abort + 125
6  swift-frontend           0x0000000102b92a31 swift::performFrontend(llvm::ArrayRef<char const*>, char const*, void*, swift::FrontendObserver*)::$_1::__invoke(void*, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, bool) + 1169
7  swift-frontend           0x00000001078c52d0 llvm::report_fatal_error(llvm::Twine const&, bool) + 288
8  swift-frontend           0x00000001078c51ab llvm::report_fatal_error(char const*, bool) + 43
9  swift-frontend           0x000000010786537f (anonymous namespace)::VerifierLegacyPass::runOnFunction(llvm::Function&) + 111
10 swift-frontend           0x00000001077ff0b9 llvm::FPPassManager::runOnFunction(llvm::Function&) + 1353
11 swift-frontend           0x00000001077fe3a0 llvm::legacy::FunctionPassManagerImpl::run(llvm::Function&) + 112
12 swift-frontend           0x0000000107805835 llvm::legacy::FunctionPassManager::run(llvm::Function&) + 341
13 swift-frontend           0x0000000102f3e3e8 swift::performLLVMOptimizations(swift::IRGenOptions const&, llvm::Module*, llvm::TargetMachine*) + 1688
14 swift-frontend           0x0000000102f3f486 swift::performLLVM(swift::IRGenOptions const&, swift::DiagnosticEngine&, llvm::sys::SmartMutex<false>*, llvm::GlobalVariable*, llvm::Module*, llvm::TargetMachine*, llvm::StringRef, swift::UnifiedStatsReporter*) + 2582
15 swift-frontend           0x0000000102b9e863 performCompileStepsPostSILGen(swift::CompilerInstance&, std::__1::unique_ptr<swift::SILModule, std::__1::default_delete<swift::SILModule> >, llvm::PointerUnion<swift::ModuleDecl*, swift::SourceFile*>, swift::PrimarySpecificPaths const&, int&, swift::FrontendObserver*) + 3683
16 swift-frontend           0x0000000102b8fd22 swift::performFrontend(llvm::ArrayRef<char const*>, char const*, void*, swift::FrontendObserver*) + 6370
17 swift-frontend           0x0000000102b11e82 main + 1266
18 libdyld.dylib            0x00007fff204d1f3d start + 1
error: Abort trap: 6 (in target 'VideoEditing' from project 'VideoEditing')
```
0 replies · 0 boosts · 374 views · Sep ’21
AVPlayer with customCompositor sometimes crashes on seeking
I have an AVVideoComposition with a custom compositor. Sometimes AVPlayer crashes on seeking, especially when the seek tolerance is set to CMTime.zero. The cause is that request.sourceFrame(byTrackID: trackId) returns nil even though it should not. Below is a log of my 3 instructions and their time ranges; all of them contain only track 1:

```
2021-09-09 12:27:50.773825+0400 VideoApp[86227:6913831] Instruction 0.0, 4.0
2021-09-09 12:27:50.774105+0400 VideoApp[86227:6913831] ...Present TrackId 1 in this instruction
2021-09-09 12:27:50.774196+0400 VideoApp[86227:6913831] Instruction 4.0, 5.0
2021-09-09 12:27:50.774258+0400 VideoApp[86227:6913831] ...Present TrackId 1 in this instruction
2021-09-09 12:27:50.774312+0400 VideoApp[86227:6913831] ...Present TrackId 1 in this instruction
2021-09-09 12:27:50.774369+0400 VideoApp[86227:6913831] Instruction 5.0, 18.845
2021-09-09 12:27:50.774426+0400 VideoApp[86227:6913831] ...Present TrackId 1 in this instruction
VideoApp/VideoEditingCompositor.swift:141: Fatal error: No pixel buffer for track 1, 4.331427
```

Here is the simple line of code that produces this error:

```swift
guard let pixelBuffer = request.sourceFrame(byTrackID: trackId) else {
    fatalError("No pixel buffer for track \(trackId), \(request.compositionTime.seconds)")
}
```

As can be seen, 4.331427 seconds is well within the second instruction, which runs from 4.0 to 5.0 seconds. Why does the custom compositor get a nil pixel buffer then? The crash times are random (the time values keep changing); the next time I run the program and seek specifically to that time, it returns a valid pixel buffer. So it has nothing to do with a particular time instant. Playback without seeking is totally fine. It looks like an issue in the AVFoundation framework rather than in the app. Has anyone ever seen such an error?
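For now I'm considering failing the request instead of trapping, so a racy seek doesn't take the whole app down (a workaround sketch, not a fix for the underlying behavior; CompositorError is my own type):

```swift
import AVFoundation

enum CompositorError: Error {
    case missingSourceFrame(trackID: CMPersistentTrackID, time: CMTime)
}

func render(_ request: AVAsynchronousVideoCompositionRequest, trackId: CMPersistentTrackID) {
    guard let pixelBuffer = request.sourceFrame(byTrackID: trackId) else {
        // Fail just this one request; AVPlayer keeps going instead of crashing.
        request.finish(with: CompositorError.missingSourceFrame(trackID: trackId,
                                                                time: request.compositionTime))
        return
    }
    // ...compose onto an output buffer here; pass-through shown for brevity...
    request.finish(withComposedVideoFrame: pixelBuffer)
}
```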
0 replies · 0 boosts · 505 views · Sep ’21
Migrating from UIScrollView to UICollectionView
I have prototyped a multilayer timeline with custom cells, where:

a. Each cell may have a different size; some cells can be bigger than the visible rect of the scroll view.
b. The gap between cells may vary (even though it appears the same in the picture below), except on the first (base) layer, where the gap is fixed at 2 points.
c. Each cell can be selected and trimmed/expanded from either end using a UIPanGestureRecognizer. Trimming/expansion has custom rules: on the base layer, a cell simply pushes the other cells as it expands or contracts; on other layers, trimming or expansion must respect the boundaries of neighbouring cells.
d. The timeline can be zoomed horizontally, which has the effect of scaling the cells.
e. Cells can be dragged and dropped onto other rows, subject to custom rules.

I have implemented all of this with UIScrollView. Currently every cell is initialized and added to the scroll view, whether visible or not, and I am hitting limits as I draw more content into each cell. That means I need to reuse cells and draw only visible content. I discussed this with Apple engineers in the WWDC labs, and one of them suggested UICollectionView with a custom layout, which provides a lot of functionality for free (such as cell reuse and drag and drop), and pointed me to the WWDC 2018 video on UICollectionView (session 225). But looking at UICollectionView custom layouts, two things are not clear to me (see the sketch at the end of this post):

Q1. How do I manually trim/expand selected cells in a UICollectionView custom layout using a pan gesture? With UIScrollView, I attach a UIPanGestureRecognizer to the cell and trim or expand its frame directly (respecting the boundary conditions).

Q2. How do I scale all the cells by a given zoom factor? With UIScrollView, I scale the frame of each cell and then recompute contentOffset to reposition the scroll view around the zoom point.

Also, even with a UICollectionView holding just one cell whose width is, say, 10x the collection view's frame width, I would need a further optimization to draw only the visible portion rather than the whole cell. How can a UICollectionViewCell draw only the part of itself that is visible on screen?
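My current thinking on Q1/Q2 is that both reduce to editing a time-based model and letting the layout recompute frames. Here is a minimal sketch under that assumption (ClipItem and TimelineLayout are hypothetical names of mine, not API):

```swift
import UIKit

// Hypothetical model: each clip knows its start time, duration, and row.
struct ClipItem {
    var start: TimeInterval
    var duration: TimeInterval
}

final class TimelineLayout: UICollectionViewLayout {
    var clips: [[ClipItem]] = []          // one array per layer/row (section)
    var pointsPerSecond: CGFloat = 50 {   // the zoom factor lives in the layout
        didSet { invalidateLayout() }
    }
    private var cache: [UICollectionViewLayoutAttributes] = []
    private var contentWidth: CGFloat = 0
    private let rowHeight: CGFloat = 60

    override func prepare() {
        cache.removeAll()
        contentWidth = 0
        for (row, layer) in clips.enumerated() {
            for (index, clip) in layer.enumerated() {
                let attrs = UICollectionViewLayoutAttributes(
                    forCellWith: IndexPath(item: index, section: row))
                // Frames are derived from the time model, never stored on cells.
                attrs.frame = CGRect(x: CGFloat(clip.start) * pointsPerSecond,
                                     y: CGFloat(row) * rowHeight,
                                     width: CGFloat(clip.duration) * pointsPerSecond,
                                     height: rowHeight - 2)
                cache.append(attrs)
                contentWidth = max(contentWidth, attrs.frame.maxX)
            }
        }
    }

    override var collectionViewContentSize: CGSize {
        CGSize(width: contentWidth, height: CGFloat(clips.count) * rowHeight)
    }

    override func layoutAttributesForElements(in rect: CGRect) -> [UICollectionViewLayoutAttributes]? {
        cache.filter { $0.frame.intersects(rect) }   // only visible cells are materialized
    }

    override func layoutAttributesForItem(at indexPath: IndexPath) -> UICollectionViewLayoutAttributes? {
        cache.first { $0.indexPath == indexPath }
    }
}
```

With this shape, a pan gesture would mutate clip.start/clip.duration (Q1) and a pinch would set pointsPerSecond (Q2), each followed by invalidateLayout(); the layout then hands back only the attributes intersecting the visible rect.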
Topic: UI Frameworks · SubTopic: UIKit
0 replies · 0 boosts · 477 views · Sep ’21
AVCaptureDevice isCenterStageEnabled KVO not fired
@AVFoundationEngineers I am trying to observe the isCenterStageEnabled property as follows:

```swift
AVCaptureDevice.self.addObserver(self,
                                 forKeyPath: "isCenterStageEnabled",
                                 options: [.initial, .new],
                                 context: &CapturePipeline.centerStageContext)
```

I have set centerStageControlMode to .cooperative. The KVO fires only when I change AVCaptureDevice.isCenterStageEnabled myself in code; it does NOT fire when the user toggles Center Stage from Control Center. Is this a bug?
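For completeness, the callback side looks like this (a minimal sketch of my observer; centerStageContext is the same static token passed to addObserver):

```swift
// Sketch of the matching callback in my CapturePipeline class.
override func observeValue(forKeyPath keyPath: String?,
                           of object: Any?,
                           change: [NSKeyValueChangeKey: Any]?,
                           context: UnsafeMutableRawPointer?) {
    guard context == &CapturePipeline.centerStageContext else {
        super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
        return
    }
    let enabled = change?[.newKey] as? Bool ?? false
    // Fires for my own writes to AVCaptureDevice.isCenterStageEnabled,
    // but not when the user toggles Center Stage in Control Center.
    print("isCenterStageEnabled -> \(enabled)")
}
```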
0 replies · 0 boosts · 530 views · Oct ’21
CIColorCube load LUT data
I have a .cube file storing LUT data; it starts like this:

```
TITLE "Cool LUT"
LUT_3D_SIZE 64
0.0000 0.0000 0.0000
0.0000 0.0000 0.0000
0.0157 0.0000 0.0000
0.0353 0.0000 0.0000
```

My question is: how do I build the NSData that the CIColorCube filter expects from this file? When using Metal, I convert this data into an MTLTexture using AdobeLUTParser. I'm not sure what to do in the Core Image case.
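The direction I'd try: parse the floats into RGBA order and hand them to the filter as Data (a sketch, assuming the red-varies-fastest ordering that .cube files normally use; colorCubeFilter(fromCubeFile:) is my own helper name):

```swift
import CoreImage

// Sketch of a .cube -> CIColorCube loader for the format shown above:
// optional TITLE line, LUT_3D_SIZE N, then N^3 lines of "r g b" floats.
func colorCubeFilter(fromCubeFile url: URL) throws -> CIFilter? {
    let text = try String(contentsOf: url)
    var dimension = 0
    var values: [Float32] = []

    for line in text.split(whereSeparator: \.isNewline) {
        let trimmed = line.trimmingCharacters(in: .whitespaces)
        if trimmed.isEmpty || trimmed.hasPrefix("TITLE") || trimmed.hasPrefix("#") { continue }
        if trimmed.hasPrefix("LUT_3D_SIZE") {
            dimension = Int(trimmed.split(separator: " ").last ?? "") ?? 0
            values.reserveCapacity(dimension * dimension * dimension * 4)
            continue
        }
        let rgb = trimmed.split(separator: " ").compactMap { Float32(String($0)) }
        guard rgb.count == 3 else { continue }
        values.append(contentsOf: rgb)
        values.append(1.0)  // CIColorCube wants RGBA, so pad alpha
    }

    guard dimension > 0, values.count == dimension * dimension * dimension * 4 else { return nil }
    let data = values.withUnsafeBufferPointer { Data(buffer: $0) }

    let filter = CIFilter(name: "CIColorCube")
    filter?.setValue(dimension, forKey: "inputCubeDimension")
    filter?.setValue(data, forKey: "inputCubeData")
    return filter
}
```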
0 replies · 0 boosts · 916 views · Dec ’21
AVVideoComposition startRequest early return without calling finishWithComposedVideoFrame
@AVFoundationEngineers I browsed through the AVCustomEdit sample code and noticed the following:

```objc
- (void)startVideoCompositionRequest:(AVAsynchronousVideoCompositionRequest *)request
{
    @autoreleasepool {
        dispatch_async(_renderingQueue, ^() {
            // Check if all pending requests have been cancelled
            if (_shouldCancelAllRequests) {
                [request finishCancelledRequest];
            } else {
                NSError *err = nil;
                // Get the next rendered pixel buffer
                CVPixelBufferRef resultPixels = [self newRenderedPixelBufferForRequest:request error:&err];
                if (resultPixels) {
                    // The resulting pixel buffer from the OpenGL renderer is passed along to the request
                    [request finishWithComposedVideoFrame:resultPixels];
                    CFRelease(resultPixels);
                } else {
                    [request finishWithError:err];
                }
            }
        });
    }
}
```

startVideoCompositionRequest: returns early without calling finishWithComposedVideoFrame, since the processing takes place asynchronously on a dispatch queue. However, the documentation of startVideoCompositionRequest states the following:

Note that if the custom compositor's implementation of -startVideoCompositionRequest: returns without finishing the composition immediately, it may be invoked again with another composition request before the prior request is finished; therefore in such cases the custom compositor should be prepared to manage multiple composition requests.

But I don't see anything in the sample code that is prepared to handle multiple in-flight composition requests. How is one supposed to handle this case?
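For what it's worth, my current reading is that the serial rendering queue itself is the management mechanism: every request that arrives before the prior one finishes just queues up behind it. A Swift sketch of that pattern (QueueingCompositor is my own name, and the pass-through render is hypothetical):

```swift
import AVFoundation
import CoreVideo

final class QueueingCompositor: NSObject, AVVideoCompositing {
    private let renderingQueue = DispatchQueue(label: "compositor.rendering") // serial
    private var shouldCancelAllRequests = false

    let sourcePixelBufferAttributes: [String: Any]? =
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    let requiredPixelBufferAttributesForRenderContext: [String: Any] =
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {}

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        // Each call enqueues its own work item; the serial queue renders requests
        // one at a time, in arrival order, so several can be "in flight" from
        // AVFoundation's point of view without stepping on each other.
        renderingQueue.async { [weak self] in
            guard let self = self else { return }
            if self.shouldCancelAllRequests {
                request.finishCancelledRequest()
            } else if let frame = self.renderedFrame(for: request) {
                request.finish(withComposedVideoFrame: frame)
            } else {
                request.finish(with: NSError(domain: "Compositor", code: -1))
            }
        }
    }

    func cancelAllPendingVideoCompositionRequests() {
        // Sketch: pending items see the flag as true; it is reset afterwards.
        shouldCancelAllRequests = true
        renderingQueue.async { [weak self] in self?.shouldCancelAllRequests = false }
    }

    private func renderedFrame(for request: AVAsynchronousVideoCompositionRequest) -> CVPixelBuffer? {
        // Hypothetical pass-through of the first source track's frame.
        request.sourceTrackIDs.first.flatMap { request.sourceFrame(byTrackID: $0.int32Value) }
    }
}
```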
0 replies · 0 boosts · 581 views · Jan ’22
Combining two apps in Xcode
I have two Xcode workspaces, and both have multiple targets. Every target in both workspaces has a storyboard file called Main.storyboard. My problem is to combine one target from the first workspace with a target from the other. What is the right approach to merging the targets?
0 replies · 0 boosts · 402 views · Feb ’22
Urgent: Xcode git remove files of another project
I dragged a group of files from one Xcode project to another and noticed that Xcode copied the whole project folder instead of just the selected group of files. I immediately selected the copied project folder and deleted it, choosing "Remove References Only" (I had a bad experience earlier where I chose to delete the files, and they got deleted from the original project). But now when I commit the changes to git, it shows me a list of 1300 files from the original project to commit, even though they are not in the project. I searched all the project subfolders and those files are nowhere to be found. What do I do to make git forget those files?
0 replies · 0 boosts · 602 views · Feb ’22
How to debug crash - Thread 1: EXC_BAD_INSTRUCTION (code=1, subcode=0x40343276)
I have a strange crash on an iOS device (EXC_BAD_INSTRUCTION). I have a custom UIControl called ScrollingScrubber; all it has is a UIScrollView, and it fires .valueChanged events while the user is scrolling. That's where the crash sometimes happens, and I have no idea how to debug it further.

```swift
var value: CGFloat = 0

override open func beginTracking(_ touch: UITouch, with event: UIEvent?) -> Bool {
    isUserDragging = scroller.isDragging
    if isUserDragging {
        sendActions(for: .editingDidBegin)
    }
    return isUserDragging
}

override open func continueTracking(_ touch: UITouch, with event: UIEvent?) -> Bool {
    value = (scroller.contentOffset.y + scroller.contentInset.top) / scroller.contentSize.height
    sendActions(for: .valueChanged) // It sometimes crashes here with EXC_BAD_INSTRUCTION, why?
    return true
}
```
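For context, the end-of-tracking side is symmetric; a minimal sketch of what those overrides would look like (assuming the control simply mirrors beginTracking):

```swift
override open func endTracking(_ touch: UITouch?, with event: UIEvent?) {
    super.endTracking(touch, with: event)
    if isUserDragging {
        isUserDragging = false
        sendActions(for: .editingDidEnd)
    }
}

override open func cancelTracking(with event: UIEvent?) {
    super.cancelTracking(with: event)
    isUserDragging = false
}
```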
0 replies · 0 boosts · 390 views · Apr ’22
iOS Settings app dark mode matching colors
I have a UITableViewController with a grouped table view. No matter what I try, I can't match the dark mode colors of the native Settings app on iOS 14. I tried the following:

```swift
self.tableView.backgroundColor = UIColor.systemGroupedBackground
```

and in tableView(_:cellForRowAt:):

```swift
cell.backgroundColor = UIColor.secondarySystemGroupedBackground
```

This matches the colors in light mode but not in dark mode.
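The only workaround I can think of is a hand-tuned dynamic color; note the dark value below is a placeholder I'd have to eyedropper from a Settings screenshot, not a known system value:

```swift
import UIKit

// Hypothetical workaround: supply my own dark-mode value via a dynamic color.
let settingsLikeCellBackground = UIColor { traits in
    traits.userInterfaceStyle == .dark
        ? UIColor(white: 0.11, alpha: 1.0)       // placeholder dark value
        : .secondarySystemGroupedBackground      // light mode already matches
}
```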
Topic: UI Frameworks · SubTopic: UIKit
0 replies · 0 boosts · 365 views · Apr ’22
External display window switch at runtime programmatically
I display my view on an external display by selecting an appropriate UISceneConfiguration:

```swift
// MARK: UISceneSession Lifecycle

@available(iOS 13.0, *)
func application(_ application: UIApplication,
                 configurationForConnecting connectingSceneSession: UISceneSession,
                 options: UIScene.ConnectionOptions) -> UISceneConfiguration {
    // Called when a new scene session is being created.
    // Use this method to select a configuration to create the new scene with.
    // Switching on the role is not strictly necessary, but I found it useful for debugging.
    switch connectingSceneSession.role {
    case .windowApplication:
        return UISceneConfiguration(name: "Default Configuration", sessionRole: connectingSceneSession.role)
    case .windowExternalDisplay:
        return UISceneConfiguration(name: "External Screen", sessionRole: connectingSceneSession.role)
    default:
        fatalError("Unknown Configuration \(connectingSceneSession.role.rawValue)")
    }
}
```

This method is called automatically when the external screen is connected or disconnected. My question: is there any way or API to disable/enable the external screen display at runtime, without the user disconnecting the HDMI cable?
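One direction I'm experimenting with is destroying the external scene session programmatically, though I haven't verified what the system actually shows on the external display afterwards (a sketch using only public UIApplication API):

```swift
import UIKit

// Tear down the external-display scene without unplugging the cable (sketch).
func dismissExternalDisplayScene() {
    guard let session = UIApplication.shared.openSessions.first(where: {
        $0.role == .windowExternalDisplay
    }) else { return }

    UIApplication.shared.requestSceneSessionDestruction(session, options: nil) { error in
        print("Could not destroy external scene: \(error)")
    }
}
```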
Topic: UI Frameworks · SubTopic: UIKit
0 replies · 0 boosts · 934 views · May ’22