What could prevent an NSView.layer from rendering with renderInContext:?

I've been working hard, searching the internet about this problem for 3 days, and I'm now running out of resources. I'm currently porting an iOS app to macOS (deployment target 10.11). The problem:

I have a view hierarchy as below:

NSScrollview

  documentView

    grouping view

      tiling view one

        array of NSImageView (each one being a tile)

      tiling view two

        array of NSImageView (each one being a tile)


The two tiling views overlap completely; depending on the UI state, one may be hidden, or the second one has its opacity set below 1.0 to blend the two tiled views.

Because of the opacity requirement, as well as performance, the views are CALayer-backed. This is done from IB, where Core Animation is checked on the top NSScrollView; from there, the whole view tree is (implicitly) layer-backed.

It works as expected: scrolling, magnification, etc.


Then I need to make an image out of the document view to generate SCNMaterial content (a 3D view). On iOS, renderInContext: on the documentView's layer works as expected and allows an image to be created. On AppKit, the context stays transparent, and so does the image: a valid object, but as if filled with clearColor.
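For context, here is a minimal sketch of the kind of snapshot code involved (the helper name and structure are mine, not verbatim from the project): the layer of a layer-backed view is rendered into a bitmap graphics context, and on macOS this is the path that comes back transparent for me.

```swift
import AppKit

// Hypothetical helper: render a layer-backed view's layer into an NSImage.
// Assumes `view.wantsLayer == true` so `view.layer` is non-nil.
func imageFromLayer(of view: NSView) -> NSImage? {
    guard let layer = view.layer else { return nil }
    let size = view.bounds.size
    guard let rep = NSBitmapImageRep(bitmapDataPlanes: nil,
                                     pixelsWide: Int(size.width),
                                     pixelsHigh: Int(size.height),
                                     bitsPerSample: 8,
                                     samplesPerPixel: 4,
                                     hasAlpha: true,
                                     isPlanar: false,
                                     colorSpaceName: .deviceRGB,
                                     bytesPerRow: 0,
                                     bitsPerPixel: 0),
          let context = NSGraphicsContext(bitmapImageRep: rep) else { return nil }
    // This is the call that, in my hierarchy, leaves the context transparent on AppKit.
    layer.render(in: context.cgContext)
    let image = NSImage(size: size)
    image.addRepresentation(rep)
    return image
}
```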

If canDrawSubviewsIntoLayers is set on the document view at creation, the view tree renders OK. This can't be the solution, since it prevents the opacity setting from working.

Even when one tiling view is hidden (so no opacity compositing), rendering fails.


I read that some kinds of views are not rendered; I don't use any of them. There are no filters and no masks, apart from a default masksToBounds setting on the whole view tree. I don't know why or where it is set. I tried to unset it on all the views at creation, with no success: it gets set again somehow on the grouping view below the documentView. This may be the problem, but why is this property out of my control?

There are plenty of posts, mainly on SO, complaining about CALayer's renderInContext: and offering code to custom-render a layer tree. Nevertheless, most are quite old, and by now there must be a simple, standard way to achieve this.

An alternative way to render the view tree, bitmapImageRepForCachingDisplayInRect: + cacheDisplayInRect:toBitmapImageRep:, behaves the same: OK with canDrawSubviewsIntoLayers, KO otherwise. Apple's code examples for making a texture out of a view use one of these two methods.
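The alternative path mentioned above can be sketched like this (helper name is mine); it fails and succeeds under exactly the same conditions as the layer-based rendering:

```swift
import AppKit

// Hypothetical helper: snapshot a view via AppKit's display-caching API
// (bitmapImageRepForCachingDisplay(in:) + cacheDisplay(in:to:)).
func snapshot(of view: NSView) -> NSImage? {
    guard let rep = view.bitmapImageRepForCachingDisplay(in: view.bounds) else { return nil }
    view.cacheDisplay(in: view.bounds, to: rep)
    let image = NSImage(size: view.bounds.size)
    image.addRepresentation(rep)
    return image
}
```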


Among other attempts, I tried setting wantsLayer on each view, with no success.

Answered by Max_B in 256993022

Well, as often happens when you post a request for help, you finally find the answer yourself… Here it is:

Given the view tree listed in the question, I managed to render it by setting canDrawSubviewsIntoLayers on each tiling view. This way, the opacity compositing between layers works, AND the views are rendered.


As to WHY this works, here is my guess, though this is not an authoritative answer: each NSImageView tile, a subview of a tiling view, has its origin set through its frame. This is not a transform on the layer but a position of the frame in the coordinates of the tiling view. This is a difference from the iOS version, where the tiles are positioned by a transform AND an offset on the tile.

So the only clue is that the layer system gets lost when rendering in context on macOS when some subviews have a frame offset?

Summary: the key point of the solution is to find the right layer on which to set canDrawSubviewsIntoLayers.
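The fix boils down to something like this (helper name is mine): flatten each tiling view's NSImageView tiles into that tiling view's own layer, while keeping the tiling views themselves as separate layers so opacity compositing between them still works.

```swift
import AppKit

// Flatten each tiling view's subview tiles into its own backing layer.
// Set on the tiling views only, NOT on the document view, so the two
// tiling layers remain distinct and can still be blended via opacity.
func prepareForSnapshot(tilingViews: [NSView]) {
    for tilingView in tilingViews {
        tilingView.canDrawSubviewsIntoLayers = true
        tilingView.needsDisplay = true  // redraw so tiles land in the flattened layer
    }
}
```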


I marked this as the answer because it solves the problem. If someone posts a more comprehensive explanation of the rendering failure, I'll change the accepted answer.

