I used this code from this site in old macOS versions to take an image of an NSView:
- (NSImage *)imageRepresentation
{
    // Temporarily make the view visible and layer-backed so the layer can be rendered
    BOOL wasHidden = self.isHidden;
    BOOL wantedLayer = self.wantsLayer;
    self.hidden = NO;
    self.wantsLayer = YES;

    NSImage *image = [[NSImage alloc] initWithSize:self.bounds.size];
    [image lockFocus];

    // Render the view's backing layer into the image's drawing context
    CGContextRef ctx = [NSGraphicsContext currentContext].CGContext;
    [self.layer renderInContext:ctx];
    [image unlockFocus];

    // Restore the original state
    self.wantsLayer = wantedLayer;
    self.hidden = wasHidden;
    return image;
}
This has always worked without problems. Now, on macOS High Sierra (10.13), the returned image is empty. How can I fix this?
The solution: instead of rendering the backing layer into a locked-focus image context (which no longer produces any output on 10.13), ask the view to draw itself into a cached bitmap representation with -cacheDisplayInRect:toBitmapImageRep::
- (NSImage *)imageRepresentation
{
    NSSize imgSize = self.bounds.size;

    // Ask the view for a bitmap suited to caching its contents, then draw into it
    NSBitmapImageRep *bir = [self bitmapImageRepForCachingDisplayInRect:self.bounds];
    [bir setSize:imgSize];
    [self cacheDisplayInRect:self.bounds toBitmapImageRep:bir];

    // Wrap the bitmap in an NSImage
    NSImage *image = [[NSImage alloc] initWithSize:imgSize];
    [image addRepresentation:bir];
    return image;
}
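For completeness, here is a minimal usage sketch, assuming imageRepresentation is declared in an NSView category; myView and the output path are placeholders:

// Capture the view and write the result to a PNG file
NSImage *snapshot = [myView imageRepresentation];
NSData *tiffData = [snapshot TIFFRepresentation];
NSBitmapImageRep *rep = [NSBitmapImageRep imageRepWithData:tiffData];
NSData *pngData = [rep representationUsingType:NSBitmapImageFileTypePNG properties:@{}];
[pngData writeToFile:@"/tmp/snapshot.png" atomically:YES];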