In the Core Media IO sample there is a mach server that provides the samples to the plugin (in the sample it gets the samples from a YUV file). The architecture of OBS is similar: the mach server reads from a camera or from a file and provides the samples to the plugin.
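To make what I mean concrete, here is a rough sketch of the kind of Mach IPC I understand to be involved. This is not Apple's sample code or OBS's actual implementation; the service name, message layout, and function names are ones I made up for illustration. The producing app registers a bootstrap service, and the plugin, running inside whatever client app loaded it, looks that service up and receives frame messages on a port it owns.

```c
/*
 * Sketch only (not the Apple sample or OBS code): the service name,
 * message layout, and helper names below are hypothetical.
 */
#include <mach/mach.h>
#include <servers/bootstrap.h>
#include <stdint.h>
#include <stdio.h>

#define SERVICE_NAME "com.example.virtualcam.server"  /* hypothetical name */
#define FRAME_MSG_ID 0x100

/* One video frame; pixel data is passed out-of-line so it is mapped,
 * not copied inline into the message. */
typedef struct {
    mach_msg_header_t         header;
    mach_msg_body_t           body;
    mach_msg_ool_descriptor_t pixels;     /* e.g. a YUV buffer */
    uint32_t                  width;
    uint32_t                  height;
    uint64_t                  host_time;  /* capture timestamp */
} frame_msg_t;

/* Server side: the producing app registers a well-known name with launchd's
 * bootstrap server. The plugin later sends a "connect" message carrying a
 * send right to its own receive port, and the server pushes frame_msg_t
 * messages to that port with mach_msg(..., MACH_SEND_MSG, ...). */
static mach_port_t server_check_in(void) {
    mach_port_t port = MACH_PORT_NULL;
    kern_return_t kr = bootstrap_check_in(bootstrap_port, SERVICE_NAME, &port);
    if (kr != KERN_SUCCESS)
        fprintf(stderr, "bootstrap_check_in failed: 0x%x\n", kr);
    return port;
}

/* Plugin side: look up the server by name from inside whichever process
 * loaded the plugin. */
static mach_port_t plugin_look_up(void) {
    mach_port_t port = MACH_PORT_NULL;
    kern_return_t kr = bootstrap_look_up(bootstrap_port, SERVICE_NAME, &port);
    if (kr != KERN_SUCCESS)
        fprintf(stderr, "bootstrap_look_up failed: 0x%x\n", kr);
    return port;
}

/* Plugin side: block on the local receive port and hand each frame to the
 * CMIO stream (e.g. wrap it in a CMSampleBuffer). */
static void plugin_receive_frames(mach_port_t local_port) {
    struct {
        frame_msg_t        msg;
        mach_msg_trailer_t trailer;   /* kernel appends a trailer on receive */
    } buf;

    for (;;) {
        kern_return_t kr = mach_msg(&buf.msg.header, MACH_RCV_MSG, 0,
                                    sizeof(buf), local_port,
                                    MACH_MSG_TIMEOUT_NONE, MACH_PORT_NULL);
        if (kr != KERN_SUCCESS)
            break;
        if (buf.msg.header.msgh_id != FRAME_MSG_ID)
            continue;

        /* buf.msg.pixels.address / .size is the frame data mapped into this
         * process; consume it, then release the pages. */
        vm_deallocate(mach_task_self(),
                      (vm_address_t)buf.msg.pixels.address,
                      buf.msg.pixels.size);
    }
}
```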
Why do we need the mach server at all? I've implemented a virtual webcam where the plugin itself captures samples from a webcam (either the built-in one or a USB one). That seems more efficient, but I assume there is a reason for routing the samples through a mach server instead. What is the reason?