is XPC from app to CMIOExtension possible?



Is this to be expected?

I’m not really up to speed on CMIO stuff but this doesn’t surprise me. That’s a pattern I see in similar contexts.

According to the Daemons and Services Programming Guide, an XPC service should have a CFBundlePackageType of XPC!, but a CMIOExtension is of type SYSX. It can't be both.

That doc is in the archive and hasn’t been updated to account for system extensions. Having said that, there’s an ongoing terminological confusion when it comes to the term XPC service. Some folks use it to mean “a small program embedded in an app”, as defined in the xpcservice.plist man page. Other folks use it to mean “an XPC listener with a published name”. This name might be published by an XPC service, a launchd daemon or agent (via the MachServices property), a Service Management login item, a system extension, and so on.

I previously used “XPC Service” for the former and “XPC service” for the latter, but that’s way too subtle. I now use “XPC service” for the former and “named XPC listener” for the latter.


As to your connection error, two things:

  • Make sure your app isn’t sandboxed. There are ways to do this in a sandboxed app but during the bring up it’s best to just avoid the sandbox.

    If your main app is sandboxed, create a tiny test app just for this bring up.

  • As to your service name, use the following to see what actually got published in the endpoints list:

    % sudo launchctl print system
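    That output is very long. To narrow it down, you can filter it for your listener's name; `com.example` here is a stand-in for your own reverse-DNS prefix, not a name from this thread:

```shell
# List everything launchd publishes in the system domain, then filter.
# "com.example" is a placeholder; substitute your own service-name prefix.
sudo launchctl print system | grep -i "com.example"
```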

Share and Enjoy

Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + ""

  • thanks Quinn, especially for the magic incantation for launchctl (I had read the enormous man page, but couldn't figure out how to do the needful). There is no service published containing my company ID. My extension's main function attempts to make an XPC listener using NSXPCListener *listener = [NSXPCListener serviceListener];, so it looks like the OS didn't do whatever it needs to do to publish a service. Which points to missing required entries in my plist. My app is not sandboxed.

Add a Comment

There is no service published containing my company ID.


I’m not sure what’s going on here. This issue stands at the intersection of XPC and CMIO and my knowledge of the latter is rather limited. My advice is that you open a DTS tech support incident and I’ll work with DTS’s CMIO specialist to get to the bottom of this.

Share and Enjoy

Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + ""

We're having the same problem converting our virtual camera to an extension - nothing we tried works as a way to communicate from the host application to the extension. I would assume that a full sample would be coming along at some point, but it's frustrating to get the extension template "working" quickly, then have no way to go from there to something useful.

  • My offer still stands.

  • I managed to get direct app to camera extension communication working, but I'm not sure I'm doing it right. The gist is this:

    You make a launch daemon which acts as a helper. The code lives in your app bundle, but the process started by launchd on your behalf will run as root. The helper makes an XPC listener on a Mach port with a global name, which can be found by your app and by the camera extension. You use the helper as a kind of mailbox to let the extension know where to find the app. This approach is mentioned in various places on this forum and elsewhere on the Internet, but never in great detail.

    In your app, find and connect to the helper. This is similar to connecting to a bundled XPC service, only you use initWithMachServiceName instead of initWithServiceName. Then make an anonymous listener, get its endpoint, and send that endpoint as a parameter to your helper. Store it in the helper.

    In your extension, find and connect to the helper, exactly the same way you do so from the app. Make an XPC call into the helper to fetch the endpoint which the app stored (in the reply block of your call). I don't know how to deal with synchronization issues here; I guess there is a slim chance that the extension could fetch the endpoint value at a time when it is in the process of being written by the app.

    Now, in the extension, build a connection around the endpoint you just received, with initWithListenerEndpoint. From the extension, make a call into your app - this will cause your app's listener:shouldAcceptNewConnection: to be called, and you have established direct communication between the extension and the app, which of course can be used in both directions.

    There are a myriad of small things you have to get right: matching protocols at each end, a properly formatted entitlement that allows the extension to break out of its sandbox to look up a Mach service name, ensuring the name of the service is prefixed by one of the app groups, making the helper bundle ID the same as its Mach service name, and so on.

    Also, I might be doing something wrong and making it all too complicated. The Camera Extension is nicely self-contained. All the app code has to do is ask to activate it. But the helper requires an installer, and it has to be at a known location so that the launchd plist can find it. Which restricts you to putting your app in /Applications, or to doing some heroics to write a plist with the correct absolute path to the bundle at run-time, and then somehow installing this newly-generated plist, all of which is very far from drag-and-drop simplicity.

    Also, the camera extension isn't replaced when you install a new one. _cmiodalassistants will stubbornly hang onto an instantiation of the older version until you reboot. I have done a lot of rebooting.

  • Super helpful thank you! We do the same thing in our app, we create a launch daemon that acts as a broker to enable XPC communication between our audio server plugin and our app.
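The broker ("mailbox") pattern described above can be sketched roughly as follows. This is a hedged illustration, not code from the posts: the EndpointBroker protocol, its method names, and the com.example.group.broker service name are all assumptions, and error handling is omitted.

```swift
import Foundation

// Hypothetical broker protocol vended by the launchd helper.
@objc protocol EndpointBroker {
    func store(endpoint: NSXPCListenerEndpoint)
    func fetchEndpoint(reply: @escaping (NSXPCListenerEndpoint?) -> Void)
}

// Must match the helper's MachServices key and carry an app-group prefix.
let brokerName = "com.example.group.broker"

// In the app: publish an anonymous listener's endpoint via the helper.
func publishAppEndpoint(delegate: NSXPCListenerDelegate) {
    let anonymous = NSXPCListener.anonymous()
    anonymous.delegate = delegate   // listener(_:shouldAcceptNewConnection:) lands here
    anonymous.resume()

    let toBroker = NSXPCConnection(machServiceName: brokerName)
    toBroker.remoteObjectInterface = NSXPCInterface(with: EndpointBroker.self)
    toBroker.resume()
    (toBroker.remoteObjectProxy as! EndpointBroker).store(endpoint: anonymous.endpoint)
}

// In the extension: fetch the stored endpoint and build a direct connection.
func connectToApp(completion: @escaping (NSXPCConnection?) -> Void) {
    let toBroker = NSXPCConnection(machServiceName: brokerName)
    toBroker.remoteObjectInterface = NSXPCInterface(with: EndpointBroker.self)
    toBroker.resume()
    (toBroker.remoteObjectProxy as! EndpointBroker).fetchEndpoint { endpoint in
        guard let endpoint else { return completion(nil) }
        let direct = NSXPCConnection(listenerEndpoint: endpoint)
        direct.resume()
        completion(direct)
    }
}
```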

Add a Comment

I'm not sure I'm doing it right.

That does sound very convoluted. I’m hoping to find a simpler way.

Someone [1] opened a TSI for this and I’ll post back here with the conclusion.

Share and Enjoy

Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + ""

[1] For privacy reasons I don’t have an easy way for me to map DevForums names to real names.

  • I'd like a simpler way too. I'd also like this forum to respect my line breaks in comments, instead it just smushes all my paragraphs into one giant block.

    I tried to send video from my app to the extension. If I (try to) send an IOSurfaceRef, I get an exception encodeDataAt:ofObjCType: unencodable type (^{__IOSurface=}) in my app. If I (try to) send an xpc_object_t derived from that IOSurfaceRef, using IOSurfaceCreateXPCObject, I get this error:

    <NSXPCConnection: 0x6000031e43c0> connection to service with pid 17661 created from an endpoint: Exception caught during decoding of received selector sendIOSurfaceToExtension:, dropping incoming message. Exception: Exception while decoding argument 0 (#2 of invocation): Exception: Attempt to decode an xpc type but whitelist does not specify an XPC type '<no key>'.

    Someone asked a similar question a year ago on Stack Overflow; the answer was "whitelist it", but whitelisting should only be necessary for objects in collection classes, no? I'm sending my frames one at a time, not in a collection. There's an API to set the XPC type on a parameter of an interface method, setXPCType:forSelector:argumentIndex:ofReply:, but I don't know what value to use. I thought it was supposed to Just Work™?

Add a Comment

I'd also like this forum to respect my line breaks in comments.

I recommend that you only use comments for… well… comments. Sadly, the current DevForums UI misleads folks into using them for replies. We plan to fix that.

As to how you transfer I/O Surface values over XPC, there are two ways to do that:

  • Directly

  • Using secure coding

The direct case is easier to explain so let’s start with that. Imagine this XPC protocol:

@objc private protocol MainDirectSurfaceTransfer {
    func transfer(surface: xpc_object_t, reply: @escaping (_ error: NSError?) -> Void)
}

The trick is to configure your NSXPCInterface so that XPC knows what XPC object type to expect for the surface parameter. Here’s how to do that:

func surfaceTransferInterface() -> NSXPCInterface {
    let interface = NSXPCInterface(with: MainDirectSurfaceTransfer.self)
    // Create a dummy surface so that we can get its XPC object type.
    let dummy = MainDirect.makeSurface()!
    let dummyXPC = IOSurfaceCreateXPCObject(dummy)
    let type = xpc_get_type(dummyXPC)
    interface.setXPCType(type, for: #selector(MainDirectSurfaceTransfer.transfer(surface:reply:)), argumentIndex: 0, ofReply: false)
    return interface
}

Annoyingly, you have to create a dummy surface to get its XPC type because there’s no way to get it directly.

On the send side you call IOSurfaceCreateXPCObject to get the XPC surface from your object. And on the receive side you call IOSurfaceLookupFromXPCObject to do the reverse.
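To make the two sides concrete, here is a hedged sketch of the send and receive paths using the MainDirectSurfaceTransfer protocol above; the free-function shape and error handling are illustrative assumptions, not code from Quinn's post:

```swift
import Foundation
import IOSurface

// Send side: wrap the surface in an XPC object and hand it to the proxy.
func send(_ surface: IOSurfaceRef, over connection: NSXPCConnection) {
    let proxy = connection.remoteObjectProxy as! MainDirectSurfaceTransfer
    proxy.transfer(surface: IOSurfaceCreateXPCObject(surface)) { error in
        if let error { NSLog("transfer failed: \(error)") }
    }
}

// Receive side, in the exported object: unwrap the XPC object again.
func transfer(surface: xpc_object_t, reply: @escaping (NSError?) -> Void) {
    guard let received = IOSurfaceLookupFromXPCObject(surface) else {
        reply(NSError(domain: NSPOSIXErrorDomain, code: Int(EINVAL), userInfo: nil))
        return
    }
    _ = received    // use the IOSurfaceRef here
    reply(nil)
}
```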

Share and Enjoy

Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + ""

I'll type this in the answer box then.

If you can require 10.12 or later, it is much simpler than this. As I found out, an IOSurfaceRef cannot be directly transported over NSXPCConnection, but an IOSurface can. No need to do the CreateXPCObject or LookupFromXPCObject stuff unless you're on 10.7 to 10.11. The two types are toll-free bridged. The only thing I'm not sure about is which flavor of __bridge to use at each end.

I'm thinking I need to write

IOSurfaceRef surface = CVPixelBufferGetIOSurface(pixelBuffer);
[remoteObjectProxy sendIOSurfaceToExtension:(__bridge_transfer IOSurface *)surface]; 

and at the receiving end

CVPixelBufferRef pixelBuffer;
CVPixelBufferCreateWithIOSurface(kCFAllocatorDefault, (__bridge_retained IOSurfaceRef)surface,...);

I guess I'll find out...

As far as I can tell, CMIO system extensions run under the system user _cmiodalassistants, which does expose the XPC services correctly. You can verify this by checking the exposed endpoints for that user:

% sudo launchctl print user/262

The problem, it seems, is that the user session (which launches the main application) does not have access to the endpoints exposed by _cmiodalassistants, resulting in the error you are seeing.

CMIO system extensions run under the system user _cmiodalassistants


which does expose the XPC services correctly

Well “correctly” is a matter of perspective (-: As I mentioned above, a developer opened a TSI about this issue. I’m actively discussing this with the CMIO team and I’ll post back here once I have a definitive answer.

Share and Enjoy

Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + ""

As we learned in the "Create camera extensions with CoreMedia IO" presentation at WWDC 2022, a Camera Extension can present a sink interface to the system, accessible to an app via the CMIOHardwareObject.h and CMIOHardwareStream.h APIs. The Camera Extension template only offers a source stream, but it is pretty easy to add a sink stream.

That's one way to get video into a Camera Extension; it works on macOS 12.3, and it doesn't require a helper daemon to enable the app and extension to find one another.

In Ventura, the plan is to have the app's extension be owned by the app's owner, so XPC would be an option.

Easier again is to have the extension itself handle the video sourcing, while the accompanying app only provides control and status reporting, using the property interfaces of the extension. That's the intent behind the design.
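That control-and-status design might be sketched like this. This is a hedged illustration of the CMIOExtension property interface; the "4cc_stat_glob_0000" four-char-code raw value and the "streaming" status are made-up assumptions:

```swift
import Foundation
import CoreMediaIO

// A custom device property the app can read through the CMIO property
// interface instead of rolling its own XPC channel.
let statusProperty = CMIOExtensionProperty(rawValue: "4cc_stat_glob_0000")

// In a CMIOExtensionDeviceSource implementation, the extension exposes it:
func deviceProperties(forProperties properties: Set<CMIOExtensionProperty>) throws -> CMIOExtensionDeviceProperties {
    let props = CMIOExtensionDeviceProperties(dictionary: [:])
    if properties.contains(statusProperty) {
        props.setPropertyState(CMIOExtensionPropertyState(value: "streaming" as NSString),
                               forProperty: statusProperty)
    }
    return props
}
```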

  • @smith_c could you post sample code for that method? I've added a second device with a sink stream. But then I'm stuck on the app side: accessing the CMIODevice that corresponds to my camera and sending frames to its sink stream is not trivial (CMIO is a C-only API; doing this in Swift isn't easy).

Add a Comment

It would be really, really helpful to have official sample code that shows how the camera extension can receive frame data from the main app in which it was embedded.

Also, it’d be really useful if the camera extension could use some window server APIs such as CGWindowListCreateImage or the newer ScreenCaptureKit API: assuming the rights to access the screen were granted to the bundling app, could the camera extension inherit that right?

That’s the approach I’m using currently in my old DAL plugin for the Screegle app to capture frames from an NSWindow that the main app has.

@smith_c is correct: we can add a sink stream to the same device, and the app can feed CMSampleBuffers to this sink stream by connecting to it using the CoreMediaIO C API. In my code, the cameraStreamSink (modeled after cameraStreamSource) simply consumes buffers by waiting on consumeSampleBuffer(from:completionHandler:) and sends them immediately to the source stream by calling send(_:discontinuity:hostTimeInNanoseconds:).

I can provide source code if needed.
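Based on that description, the forwarding step inside the extension might look roughly like this. A hedged sketch, not the poster's actual code: the stream and client objects are assumed to come from the extension's startStream machinery, and the host-time stamping is a simplification:

```swift
import CoreMediaIO
import Darwin

// Pull one buffer from the sink stream and re-emit it on the source stream.
func forwardOneBuffer(from sink: CMIOExtensionStream,
                      to source: CMIOExtensionStream,
                      client: CMIOExtensionClient) {
    sink.consumeSampleBuffer(from: client) { sampleBuffer, sequenceNumber, discontinuity, hasMoreSampleBuffers, error in
        guard let sampleBuffer else { return }
        // Stamp with the current host time, in nanoseconds.
        source.send(sampleBuffer,
                    discontinuity: discontinuity,
                    hostTimeInNanoseconds: clock_gettime_nsec_np(CLOCK_UPTIME_RAW))
    }
}
```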

  • Please share if you can! We're able to send into the sink, but having trouble consuming the stream in the extension

  • Where do you call consumeSampleBuffer? It doesn't seem to be a protocol function

Add a Comment

@ldeoue, I think an example of sending the frames from the app to the extension's sink stream would be very helpful.

Here’s a sample project showing how a camera extension can have a source and a sink stream, and how to feed the sink stream with sample buffers from the main app.

  • thank you for this. it was very helpful.

Add a Comment

So, has anybody found any solution to the XPC connection problem? I seem to be facing the same issue.

  • @eskimo Where can we find sample code that shows us how to send sample buffers from a client using the old C style Core Media IO API to a camera extension that contains an output device?

Add a Comment