Hello!
I ran into unexpected behavior of hardware keyboard focus in UITests.
A clear description of the problem
When running UITests on the iOS Simulator with both "Full Keyboard Access" and "Connect Hardware Keyboard" enabled, there is a noticeable delay in handling the keyboard actions that manage focus (like pressing Tab or the arrow keys). The delay seems to increase with repeated input, which suggests that events are being queued instead of processed immediately.
I will explain below why I make this assumption.
A step-by-step set of instructions to reproduce the problem
Launch the iOS Simulator.
Enable both "Full Keyboard Access" and "Connect Hardware Keyboard" in the Simulator settings.
Run a UITest on the target application (ideally an endless or long-running test; a minimal sketch is shown after these steps).
Once the app is launched, press the Tab key several times.
Observe the delay in focus movement.
Optionally, press the Tab or arrow keys rapidly, then stop the UITest.
After stopping, you’ll see a burst of rapid focus changes.
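For reference, here is a minimal sketch of the kind of long-running UITest used in step 3 (class and method names are just placeholders). It launches the app and then waits on an inverted expectation so the test stays alive while Tab or the arrow keys are pressed manually on the hardware keyboard.

```swift
import XCTest

final class KeyboardFocusUITests: XCTestCase {
    // Keeps the app under UITest control so hardware keyboard focus
    // changes (Tab / arrow keys) can be observed manually.
    func testKeepAppRunningForManualFocusChecks() {
        let app = XCUIApplication()
        app.launch()

        // An inverted expectation is never fulfilled, so the wait simply
        // keeps the test running for the full timeout (10 minutes here).
        let keepAlive = expectation(description: "keep the test running")
        keepAlive.isInverted = true
        wait(for: [keepAlive], timeout: 600)
    }
}
```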
What results you expected
We expected keyboard actions (like Tab) to be handled immediately and the UI focus to update smoothly during UITests.
What results you saw
There was a 4–10 second (and sometimes longer) delay between pressing keys and seeing a response. All of the queued keyboard events (used for managing focus) were then performed at once after stopping the UITest.
The version of Xcode you are using
Xcode: Version 16.3 (16E140)
Simulator: iPhone 16 Pro (iOS 18.4 and 18.1)
Simulator: iPad Pro 11-inch (M4) (iPadOS 17.5)
I have a question about the content of a custom NSDockTile view.
For example, I have several figures drawn with CALayers (pic 1).
Initial information:
First of all, I added the green figure L to the contentView's layer.
Then I added figures R1 and R2. They are identical, except that R1 has a Y offset of just a few points.
So I expect R1 and R2 to cover the green figure L.
But because of interpolation, I can get the wrong overlay effect: figure L appears to cover figure R1.
I have a custom view that I set as NSApp.dockTile.contentView,
and I use NSApp.dockTile.display() to update the icon.
pic 1 - figures
Question:
How can I control the interpolation for the contentView?
Should I create my own NSImage and set it as the dock icon instead?
Or would that just give me the same result in the end?
The main question is how I can fix or improve this.
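For reference, here is a rough sketch of what I mean by creating my own NSImage (the function name, size, and interpolation setting are just assumptions on my side): render the layer tree into an NSImage with an explicit interpolation quality and set it as the application icon, instead of relying on dockTile.contentView.

```swift
import AppKit

// Sketch: render a CALayer tree into an NSImage with explicit
// interpolation, then use it as the Dock icon.
func makeDockIcon(from rootLayer: CALayer, size: NSSize) -> NSImage {
    let image = NSImage(size: size)
    image.lockFocus()
    if let context = NSGraphicsContext.current {
        // Choose the interpolation quality explicitly instead of relying
        // on whatever the dock tile view does by default.
        context.imageInterpolation = .high
        rootLayer.render(in: context.cgContext)
    }
    image.unlockFocus()
    return image
}

// Usage (instead of NSApp.dockTile.contentView + display()):
// NSApp.applicationIconImage = makeDockIcon(from: myRootLayer,
//                                           size: NSSize(width: 128, height: 128))
```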
Problem:
I need to create an app (General) that will start/stop and manage other apps (Secondary).
There will be only one General app, but there can be as many Secondary apps as the user wants.
The General app should also communicate with the Secondary apps.
PS: Each Secondary app MUST run only as a separate app instance.
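For context, here is a rough sketch of how I imagine the General app starting a Secondary app as a separate instance (the app path is just a placeholder; I'm assuming NSWorkspace.OpenConfiguration with createsNewApplicationInstance is the right tool for this):

```swift
import AppKit

// Sketch: launch the Secondary app as a new, separate instance,
// even if another instance is already running.
func launchSecondaryInstance() {
    let url = URL(fileURLWithPath: "/Applications/Secondary.app") // placeholder path
    let configuration = NSWorkspace.OpenConfiguration()
    configuration.createsNewApplicationInstance = true

    NSWorkspace.shared.openApplication(at: url, configuration: configuration) { app, error in
        if let error = error {
            print("Failed to launch Secondary: \(error)")
        } else if let app = app {
            print("Launched Secondary with pid \(app.processIdentifier)")
        }
    }
}
```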
Question:
I want to hear your thoughts, advice, and best practices on:
Should I create one Xcode project with two targets (General and Secondary), or should I create two separate Xcode projects?
As for me, maybe the best solution is the second one, combined with some CI/CD. What do you think?
How should the General app communicate with the Secondary apps?
As for me: could Distributed Notifications help in this situation?
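For reference, a rough sketch of what I have in mind with Distributed Notifications (the notification name and payload are just placeholders, and I understand sandboxing restricts what can be sent this way):

```swift
import Foundation

// Placeholder name shared by the General and Secondary apps.
let commandNotification = Notification.Name("com.example.general.command")

// In the General app: broadcast a command to all Secondary instances.
func sendCommand(_ command: String) {
    DistributedNotificationCenter.default().postNotificationName(
        commandNotification,
        object: nil,
        userInfo: ["command": command],
        deliverImmediately: true
    )
}

// In a Secondary app: listen for commands from the General app.
func startListening() -> NSObjectProtocol {
    return DistributedNotificationCenter.default().addObserver(
        forName: commandNotification,
        object: nil,
        queue: .main
    ) { notification in
        let command = notification.userInfo?["command"] as? String
        print("Received command: \(command ?? "unknown")")
    }
}
```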
Thank you; I would be glad for any information, advice, or sources.
I want to make a slow-motion video smoother, and as I understand it, I need to increase the video's FPS to do that.
As far as I know, the ffmpeg library can do this, but is there a native way to increase the FPS of a video?
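For reference, one native route seems to be AVFoundation: an AVMutableVideoComposition with a smaller frameDuration makes an export emit more frames per second. A rough sketch follows (the output URL, preset, and file type are just placeholders); as far as I can tell this only resamples existing frames rather than synthesizing new in-between frames, which is why I'm asking whether there is a better native option.

```swift
import AVFoundation

// Sketch: re-export an asset with a higher output frame rate.
func exportAtHigherFrameRate(asset: AVAsset, to outputURL: URL, fps: Int32) {
    // Mirror the asset's properties, then override the frame duration
    // so each output frame lasts 1/fps seconds.
    let videoComposition = AVMutableVideoComposition(propertiesOf: asset)
    videoComposition.frameDuration = CMTime(value: 1, timescale: fps)

    guard let session = AVAssetExportSession(
        asset: asset,
        presetName: AVAssetExportPresetHighestQuality
    ) else { return }

    session.outputURL = outputURL
    session.outputFileType = .mov
    session.videoComposition = videoComposition

    session.exportAsynchronously {
        print("Export finished with status: \(session.status.rawValue)")
    }
}
```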