Control system video effects support for CMIO extension
We're distributing a virtual camera with our app that does not benefit in the slightest from automatically applied system video effects, either on the video going in (the physical camera device) or on the video going out (the virtual camera device). I'm aware of setting NSCameraReactionEffectGesturesEnabledDefault in Info.plist and of determining the active video effects via the AVCaptureDevice API. Those are obviously crutches, because having to tell users to go looking for and clicking around in menu bar apps is the opposite of a great UX.

To make our product's video output more deterministic, I'm looking for a way to tell the CMIO subsystem that our virtual camera does not support any of the system video effects. I see properties like AVCaptureDevice.Format.isPortraitEffectSupported and AVCaptureDevice.Format.isStudioLightSupported whose documentation refers to the format's ability to support these effects. Since we're setting a CMFormatDescription via CMIOExtensionStreamSource.formats, I was hoping to find something similar in the extension APIs, but haven't been successful so far.

Can this be done?
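
For context, this is roughly how the stream formats are declared in the extension today (resolution and frame rate are placeholder values); I don't see anything on CMIOExtensionStreamFormat or in the format description that would declare effect support:

import CoreMedia
import CoreMediaIO
import CoreVideo

// Roughly how the virtual camera's stream formats are declared today.
// Resolution and frame rate are placeholder values.
func makeStreamFormats() -> [CMIOExtensionStreamFormat] {
    var videoDescription: CMFormatDescription?
    CMVideoFormatDescriptionCreate(
        allocator: kCFAllocatorDefault,
        codecType: kCVPixelFormatType_32BGRA,
        width: 1920,
        height: 1080,
        extensions: nil,
        formatDescriptionOut: &videoDescription
    )
    guard let videoDescription else { return [] }

    // One fixed 30 fps format; nothing here appears to describe support
    // (or lack thereof) for system video effects.
    let frameDuration = CMTime(value: 1, timescale: 30)
    return [
        CMIOExtensionStreamFormat(
            formatDescription: videoDescription,
            maxFrameDuration: frameDuration,
            minFrameDuration: frameDuration,
            validFrameDurations: nil
        )
    ]
}
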
2 replies · 0 boosts · 153 views · Nov ’25

Can't center entity on AnchorEntity(.plane)
How can entities be centered on a plane AnchorEntity? Not only is the box offset from the anchor's center, the offset also varies depending on where the user is located in the space when the app starts.

This is my code:

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            let wall = AnchorEntity(.plane(.vertical, classification: .wall, minimumBounds: [2.0, 1.5]),
                                    trackingMode: .continuous)
            let mesh = MeshResource.generateBox(size: 0.3)
            let box = ModelEntity(mesh: mesh, materials: [SimpleMaterial(color: .green, isMetallic: false)])
            box.setParent(wall)
            content.add(wall)
        }
    }
}

With PlaneDetectionProvider being unavailable in the simulator, I currently don't see another way to place entities at least somewhat consistently at anchors in a full space.
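
For reference, the device-only alternative I'm referring to would look roughly like this (a minimal sketch; the root entity and placement logic are illustrative assumptions, not working production code):

import ARKit
import RealityKit

// Minimal sketch of the device-only alternative using PlaneDetectionProvider
// (not available in the simulator). `root` is assumed to sit at the immersive
// space origin so the anchor's world transform can be applied directly.
func placeBoxesOnWalls(root: Entity) async throws {
    let session = ARKitSession()
    let planes = PlaneDetectionProvider(alignments: [.vertical])
    try await session.run([planes])

    for await update in planes.anchorUpdates {
        guard update.event == .added, update.anchor.classification == .wall else { continue }

        let box = ModelEntity(mesh: .generateBox(size: 0.3),
                              materials: [SimpleMaterial(color: .green, isMetallic: false)])
        // Place the box at the detected plane anchor's origin in world space.
        box.transform = Transform(matrix: update.anchor.originFromAnchorTransform)
        root.addChild(box)
    }
}
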
1 reply · 0 boosts · 1k views · Sep ’23

macOS: How to map audio device inputs to AVCaptureMovieFileOutput output tracks?
The Situation

I'm on macOS and I have an AVCaptureSession with camera and audio device inputs that are fed into an AVCaptureMovieFileOutput. What I am looking for is a way to map audio device input channels to file output audio channels, preferably using an explicit channel map.

By default, AVCaptureMovieFileOutput takes (presumably) the maximum number of input channels from an audio device that matches an audio format supported by the capture output, and records all of them. This works as expected for mono devices like the built-in microphone and stereo USB mics, the result being either a 1ch mono or a 2ch stereo audio track in the recorded media file. However, the user experience breaks down for 2ch input devices that have an input signal on only one channel, which is a reasonable setup for a 2ch audio interface with one mic connected. This produces a stereo track with the one input channel panned hard to one side. It gets even weirder for multichannel interfaces. For example, an 8ch audio input device results in a 7.1 audio track in the recorded media file with the input audio mapped to separate channels. This is far from ideal during playback, where audio sources surprisingly come from seemingly random directions.

The Favored Solution

Ideally, users should be able to select via UI which channels of their audio input device are mapped to which audio channel in the recorded media file. The resulting channel map would be configured somewhere on the capture session.

The Workaround

I have found that AVCaptureFileOutput does not respond well to channel layouts that are not standard audio formats like mono, stereo, quadraphonic, 5.1, and 7.1. This means channel descriptions and channel bitmaps are out of the question. What does work is configuring the output with one of the supported channel layouts and disabling audio channels via AVCaptureConnection; a rough sketch of this is shown below. With that, the output's encoder produces reasonable results for mono and stereo input devices if the configured channel layout is kAudioChannelLayoutTag_Stereo, but anything else is mixed down to mono. I am somewhat sympathetic to this solution insofar as, in lieu of an explicit channel map, the best guess the audio encoder can make is mixing every enabled channel down to mono. But, as described above, this breaks for 2ch input devices where only one channel is connected to a signal source. The result is a stereo track with audio panned hard to one side.

The Question

Is there a way to implement the described favored solution with AVCapture* API only, and if not, what's the preferred way of dealing with this scenario - going directly to AVAudioEngine and AVAssetWriter?
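
For illustration, the workaround currently looks roughly like this (variable names and the AAC output settings are assumptions, not the exact production code):

import AVFoundation
import CoreAudioTypes

// Rough sketch of the workaround: force a standard stereo layout on the movie
// file output and disable all but one input channel on the audio connection.
// `movieOutput` and `enabledChannelIndex` are assumptions for illustration.
func applyStereoWorkaround(movieOutput: AVCaptureMovieFileOutput, enabledChannelIndex: Int) {
    guard let audioConnection = movieOutput.connection(with: .audio) else { return }

    // Ask the output's encoder for a plain stereo track.
    var layout = AudioChannelLayout()
    layout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo
    let layoutData = Data(bytes: &layout, count: MemoryLayout<AudioChannelLayout>.size)

    movieOutput.setOutputSettings([
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVNumberOfChannelsKey: 2,
        AVChannelLayoutKey: layoutData
    ], for: audioConnection)

    // Disable every device input channel except the selected one (macOS only).
    for (index, channel) in audioConnection.audioChannels.enumerated() {
        channel.isEnabled = (index == enabledChannelIndex)
    }
}
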
1 reply · 0 boosts · 1k views · Jun ’23

App intent with parameter launches app before taking user input
I built a couple of app intents for macOS, which generally work great. However, I'm struggling with configuring an app intent that takes a parameter so that it doesn't require the app to launch before presenting people with the list of options.

If the app is running and I run the intent in Spotlight, I can see the message defined by the intent's parameterSummary and I can select a parameter from the list of entities. If the app is not running, it is launched first, and only then does the intent message fully populate in Spotlight and allow parameter selection.

What I've tried (a reduced sketch of this setup follows below):

- Supported background or deferred mode in the intent.
- Conformed the entities to IndexedEntity.
- Conformed the entity query to EnumerableEntityQuery, implementing suggestedEntities and allEntities.
- Conformed the entity query to EntityStringQuery.
- Donated the intent to Spotlight on app launch.
- Donated the entities to Spotlight on app launch, both using indexSearchableItems and indexAppEntities. Not sure if both are required or if the latter is just a more convenient version of the former.

Do I have to conform to or implement something else? Do I need to work with an app intents extension? If so, would I put all app intent code into the extension instead of the main app? Is this a system bug I should file?
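
For completeness, the current shape of the intent, entity, and query is roughly the following (all type and property names are placeholders, and the entity data is hardcoded purely for illustration):

import AppIntents

struct WidgetEntity: AppEntity, IndexedEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Widget"
    static var defaultQuery = WidgetQuery()

    var id: String
    var name: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }
}

struct WidgetQuery: EnumerableEntityQuery {
    func allEntities() async throws -> [WidgetEntity] {
        // Normally loaded from the app's store; hardcoded here for illustration.
        [WidgetEntity(id: "a", name: "Widget A"), WidgetEntity(id: "b", name: "Widget B")]
    }

    func entities(for identifiers: [String]) async throws -> [WidgetEntity] {
        try await allEntities().filter { identifiers.contains($0.id) }
    }

    func suggestedEntities() async throws -> [WidgetEntity] {
        try await allEntities()
    }
}

struct OpenWidgetIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Widget"
    static var openAppWhenRun: Bool = false

    @Parameter(title: "Widget")
    var widget: WidgetEntity

    static var parameterSummary: some ParameterSummary {
        Summary("Open \(\.$widget)")
    }

    func perform() async throws -> some IntentResult {
        // Do the work without bringing the app to the foreground.
        return .result()
    }
}
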
0 replies · 0 boosts · 52 views · 3w