Hi Albert,
Thanks for your quick reply! I have to say I was half expecting this to be the answer, but a small part of me hoped it would not be the case.
It makes so much sense to have these controls for external cameras too. In particular, my app has a "full screen" mode, and being able to use the Camera Control button whilst in that mode to configure settings such as shutter speed, ISO, and aperture, and to actuate the shutter, would make for a much better user experience! As it stands, users have to jump in and out of full screen mode to adjust any of these shooting settings.
These aren't particularly creative use cases as far as I can see; they would simply allow a more seamless interaction with the external camera over WiFi, make the shooting experience less cluttered, and free up more of the screen real estate for composition!
I have submitted an enhancement under FB15100641 requesting this.
I do have a further question: do you know how much of the AVCaptureSession setup I would need to run in order to unlock this functionality, and how much of a draw that would have on CPU and memory usage? I am half tempted to set up an AVCaptureDevice but simply not render it in a preview layer in order to open up this functionality. Could you also speak to whether that approach would lead to rejection at the App Review stage? Perhaps there's someone at Apple I could speak to about this, to get it pre-approved or at least have a discussion around it?
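For what it's worth, the minimal setup I have in mind looks something like the rough, untested sketch below. It runs a capture session with a device input but never attaches a preview layer, and hangs a Camera Control slider off it; the names come from the iOS 18 AVCaptureControl additions as I understand them, and the WiFi forwarding call is purely hypothetical:

```swift
import AVFoundation

// Sketch: run a capture session purely to unlock the Camera Control
// overlay, without ever displaying frames. Untested outline only.
final class ControlOnlySession: NSObject, AVCaptureSessionControlsDelegate {
    let session = AVCaptureSession()

    func configure() {
        session.beginConfiguration()
        defer { session.commitConfiguration() }

        // A device input seems to be required even though we never render it.
        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return }
        session.addInput(input)

        guard session.supportsControls else { return }
        session.setControlsDelegate(self, queue: .main)

        // Hypothetical slider whose value is forwarded to the WiFi camera.
        let iso = AVCaptureSlider("ISO", symbolName: "camera.aperture",
                                  in: 100...6400)
        iso.setActionQueue(.main) { value in
            // sendISOToExternalCamera(value) -- your WiFi transport here
        }
        if session.canAddControl(iso) { session.addControl(iso) }
    }

    // MARK: AVCaptureSessionControlsDelegate
    func sessionControlsDidBecomeActive(_ session: AVCaptureSession) {}
    func sessionControlsWillEnterFullscreenAppearance(_ session: AVCaptureSession) {}
    func sessionControlsWillExitFullscreenAppearance(_ session: AVCaptureSession) {}
    func sessionControlsDidBecomeInactive(_ session: AVCaptureSession) {}
}
```

My worry is exactly the cost of that never-displayed input: the session still captures frames, so there would presumably be some ongoing CPU/memory draw even though nothing is rendered.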
Simon