Using OpenGL Buffers

This article describes the methods that your video data source object needs to implement to provide OpenGL content for iChat Theater. Read Setting the Video Data Source to learn how to set a video data source object before reading this article. If your video data source uses pixel buffers, read Using Pixel Buffers instead.
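
As a reminder, registering the data source with the shared IMAVManager might look like the following minimal sketch; the method name startSharingWithDataSource: and the videoDataSource parameter are placeholders for your own code, and the header path is an assumption about your project setup:

#import <InstantMessage/IMAVManager.h>

// Hypothetical helper: register the data source and start providing video.
- (void)startSharingWithDataSource:(id)videoDataSource {
    IMAVManager *avManager = [IMAVManager sharedAVManager];
    [avManager setVideoDataSource:videoDataSource];  // an object implementing the IMVideoDataSource methods
    [avManager start];                               // begin providing video to iChat Theater
}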

Getting the Video Format

Your video data source object needs to implement the getOpenGLBufferContext:pixelFormat: IMVideoDataSource protocol method to return the OpenGL context and pixel format for the video content. The IMAVManager object needs this information to properly display and transmit the video.

- (void)getOpenGLBufferContext:(CGLContextObj *)contextOut pixelFormat:(CGLPixelFormatObj *)pixelFormatOut {
    // Return the context and pixel format created in the designated initializer.
    *contextOut = context;
    *pixelFormatOut = pixelFormat;
}

Typically, the context and pixel format objects are created and retained by the video data source object in the designated initializer. This code fragment creates an OpenGL context and pixel format object:

        GLint npix = 0;
        CGLPixelFormatAttribute attributes[] = {
            kCGLPFAColorSize, 24,
            0   // terminate the attribute list
        };
        CGLChoosePixelFormat(attributes, &pixelFormat, &npix);
        CGLCreateContext(pixelFormat, [[self openGLContext] CGLContextObj], &context);
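
The fragment above does not show the matching cleanup. A minimal sketch, assuming the context and pixelFormat instance variables used above and manual reference counting, destroys both objects in dealloc:

- (void)dealloc {
    // Release the OpenGL objects created in the designated initializer.
    CGLDestroyContext(context);
    CGLDestroyPixelFormat(pixelFormat);
    [super dealloc];
}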

Rendering Video Frames

Your video data source object needs to implement the renderIntoOpenGLBuffer:onScreen:forTime: IMVideoDataSource protocol method to render the OpenGL content into the buffer. The Instant Message framework passes the screen as an in-out parameter when it invokes renderIntoOpenGLBuffer:onScreen:forTime: so that rendering can be handled efficiently on computers that have multiple graphics cards.

Note that OpenGL is not thread-safe, so if you are rendering to the display and to the buffer at the same time, you need to use the OpenGL macros to render in two different contexts (the default context for the display and an alternate context for the buffer), as described in Improving Performance in OpenGL Programming Guide for Mac.
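
The CGL macros are enabled by importing OpenGL/CGLMacro.h; every OpenGL call made in a scope that declares a local variable named cgl_ctx is then sent to that specific context instead of the current context. A minimal sketch follows; the clearAlternateContext function is only an illustration:

#import <OpenGL/CGLMacro.h>

static void clearAlternateContext(CGLContextObj alternateContext) {
    CGLContextObj cgl_ctx = alternateContext;  // the CGL macros route the calls below to this context
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
}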

This implementation of the renderIntoOpenGLBuffer:onScreen:forTime: method in an NSView subclass uses the OpenGL macros to render into the passed OpenGL buffer using an alternate context:

- (BOOL) renderIntoOpenGLBuffer:(CVOpenGLBufferRef)buffer onScreen:(int *)screenInOut forTime:(CVTimeStamp*)timeStamp {
    // We ignore the timestamp, signifying that we're providing content for 'now'.
    // Make sure we agree on the screen ID.
    CGLContextObj cgl_ctx = _alternateContext;
    CGLGetVirtualScreen(cgl_ctx, screenInOut);
    // Attach the OpenGLBuffer and render into the _alternateContext.
    if (CVOpenGLBufferAttach(buffer, _alternateContext, 0, 0, *screenInOut) == kCVReturnSuccess) {
        // In case the buffers have changed in size, reset the viewport.
        CGRect cleanRect = CVImageBufferGetCleanRect(buffer);
        glViewport(CGRectGetMinX(cleanRect), CGRectGetMinY(cleanRect), CGRectGetWidth(cleanRect), CGRectGetHeight(cleanRect));
        // Render
        [self _renderInContext:_alternateContext];
        return YES;
    } else {
        // This should never happen.  The safest thing to do if it does is return
        // 'NO' (signifying that the frame has not changed).
        return NO;
    }
}

The _renderInContext: method in the sample code performs the actual rendering using the supplied context; it is also invoked by the drawRect: method, as shown in this code fragment:

- (void) _renderInContext:(CGLContextObj)cgl_ctx {
    // The cgl_ctx parameter name enables the CGL macros for the calls below.
    glClearColor(0.0, 0.0, 0.0, 1.0);
    /* ... */
}

- (void) drawRect:(NSRect)rect {
    // Render in the normal context.
    [self _renderInContext:[[self openGLContext] CGLContextObj]];
}
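
For illustration only, a hypothetical _renderInContext: body might clear the buffer and draw a single triangle with the fixed-function pipeline; substitute your own drawing code. Because the parameter is named cgl_ctx, the CGL macros route every call to the supplied context:

- (void) _renderInContext:(CGLContextObj)cgl_ctx {
    // Hypothetical example: clear to black and draw one colored triangle.
    glClearColor(0.0, 0.0, 0.0, 1.0);
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_TRIANGLES);
    glColor3f(1.0f, 0.0f, 0.0f); glVertex2f(-0.5f, -0.5f);
    glColor3f(0.0f, 1.0f, 0.0f); glVertex2f( 0.5f, -0.5f);
    glColor3f(0.0f, 0.0f, 1.0f); glVertex2f( 0.0f,  0.5f);
    glEnd();
    glFlush();
}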

See OpenGL Programming Guide for Mac for more information on using OpenGL.