CILanczosScaleTransform performance

I am trying to use a CIFilter, CILanczosScaleTransform. Basically, I perform two steps:

1. Feeding an image/video frame to the CIFilter and getting the output:

CIFilter *f = [CIFilter filterWithName:@"CILanczosScaleTransform"];

// Downscale the image
[f setValue:[NSNumber numberWithFloat:scaleFactor] forKey:@"inputScale"];
[f setValue:[NSNumber numberWithFloat:1.0] forKey:@"inputAspectRatio"];
[f setValue:inputImage forKey:@"inputImage"];
CIImage *outImage = [f valueForKey:@"outputImage"];
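For context, here is roughly how each frame gets wrapped in a CIImage before being fed to the filter (a sketch, assuming the frames arrive as CVPixelBuffers from AVFoundation; `framePixelBuffer` and the scale factor value are illustrative):

// Wrap the raw video frame; no pixel data is copied at this point
CIImage *inputImage = [CIImage imageWithCVPixelBuffer:framePixelBuffer];

// Example downscale factor, e.g. 1920x1080 -> 960x540
float scaleFactor = 0.5f;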

2. Rendering the output image to a CVPixelBuffer:

CVPixelBufferLockBaseAddress(outputPixelBuffer, 0);
[ciContext render:outImage toCVPixelBuffer:outputPixelBuffer];
CVPixelBufferUnlockBaseAddress(outputPixelBuffer, 0);

Here, the CIContext is created from an OpenGL ES context:

EAGLContext *eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
NSDictionary *options = @{ kCIContextWorkingColorSpace : [NSNull null] };
CIContext *ciContext = [CIContext contextWithEAGLContext:eaglContext options:options];
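As an aside, to rule out a silent software fallback I also tried passing the explicit renderer option when creating the context (my understanding is that an EAGL-backed context should use the GPU anyway, so this is just a sanity check):

// Explicitly disallow the software renderer (sanity check only)
NSDictionary *options = @{ kCIContextWorkingColorSpace : [NSNull null],
                           kCIContextUseSoftwareRenderer : @(NO) };
CIContext *ciContext = [CIContext contextWithEAGLContext:eaglContext options:options];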


I am aware that the actual scaling happens only when the render method is called.


Also, I assume that the scaling happens on the GPU, since the CIContext is created from an OpenGL ES context.

I measured the performance of the scaling and it seems to be pretty bad: for a 30-second 1920x1080 video, it takes around 43 seconds on an iPhone 5s running iOS 8.3. So it is not even real time.
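For reference, this is roughly how I timed it (a sketch; `totalRenderTime` is a variable I accumulate across all frames, and CACurrentMediaTime comes from QuartzCore):

CFTimeInterval start = CACurrentMediaTime();
CVPixelBufferLockBaseAddress(outputPixelBuffer, 0);
[ciContext render:outImage toCVPixelBuffer:outputPixelBuffer];
CVPixelBufferUnlockBaseAddress(outputPixelBuffer, 0);
totalRenderTime += CACurrentMediaTime() - start;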


Am I missing something here? Is there any configuration/setting which can improve the performance? Or is it a current hardware limitation?
