NSData memory mapped WWDC session 212

I found the session Optimizing Your App for Multitasking on iPad in iOS 9 really interesting.

Thanks to a Carmack quote back in 2011, I first heard about memory mapping on iOS, but I've never found a concrete example of how to achieve it or where to apply it.

The session comes without sample code or a playground file, which is a pity because I'd like to recreate that kind of scenario to profile.

Back in 2013, out of curiosity, I wrote this answer on Stack Overflow about how to memory-map big images, but I never got a clear reply.

Since the session talks about memory-mapping sets of images without impacting RAM, I can't tell whether I've misunderstood some basic concept or am simply doing something wrong.

I've tested memory mapping with the simple lines of code below, but profiling with the Allocations instrument I always see the image allocated in RAM as dirty memory, and the app crashes on a real device because of the huge size of the image.

NSError * __autoreleasing error = nil;
NSString *path = [[NSBundle mainBundle] pathForResource:@"tigerbig" ofType:@"png"];
// Map the file into memory instead of reading it all in up front.
NSData *mappedData = [NSData dataWithContentsOfFile:path options:NSDataReadingMappedAlways | NSDataReadingUncached error:&error];
// Decode the mapped data and display it.
UIImage *theBigImage = [UIImage imageWithData:mappedData];
self.imageView.image = theBigImage;
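
For comparison, here is a minimal sketch of the access pattern where a mapped NSData should actually pay off: touching only a small range of a big file, so that only those pages are faulted in. The file name here is a hypothetical stand-in:

NSError * __autoreleasing error = nil;
// "hugefile.dat" is a hypothetical large bundled resource.
NSString *path = [[NSBundle mainBundle] pathForResource:@"hugefile" ofType:@"dat"];
NSData *mapped = [NSData dataWithContentsOfFile:path options:NSDataReadingMappedIfSafe error:&error];
if (mapped) {
    // Copy out just the first 4 KB; only the pages backing this
    // range need to be resident, and they stay clean (file-backed).
    NSUInteger length = MIN((NSUInteger)4096, mapped.length);
    NSData *slice = [mapped subdataWithRange:NSMakeRange(0, length)];
    NSLog(@"read %lu of %lu bytes", (unsigned long)slice.length, (unsigned long)mapped.length);
}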


Can someone explain to me what I'm doing wrong?

Accepted Answer (by junkpile)

UIImage needs all the data at once. It can't take advantage of a memory-mapped file. As was pointed out in your SO question, you would need to use your own CATiledLayer or some other approach that only loads small bits of the data at a time.
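
For what it's worth, here is a rough sketch of the CATiledLayer direction the answer suggests, assuming the big image has been pre-cut into 256x256 PNG tiles named tile_<row>_<col>.png (the tile size and naming are made up for illustration):

#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>

@interface TiledImageView : UIView
@end

@implementation TiledImageView

// Back the view with a CATiledLayer so drawRect: is called
// per visible tile instead of once for the whole image.
+ (Class)layerClass {
    return [CATiledLayer class];
}

- (instancetype)initWithFrame:(CGRect)frame {
    if ((self = [super initWithFrame:frame])) {
        ((CATiledLayer *)self.layer).tileSize = CGSizeMake(256.0, 256.0);
    }
    return self;
}

- (void)drawRect:(CGRect)rect {
    // Only the tiles that are on screen are requested, so at most a
    // handful of small images are decoded in memory at any time.
    NSInteger col = (NSInteger)(CGRectGetMinX(rect) / 256.0);
    NSInteger row = (NSInteger)(CGRectGetMinY(rect) / 256.0);
    NSString *name = [NSString stringWithFormat:@"tile_%ld_%ld", (long)row, (long)col];
    UIImage *tile = [UIImage imageNamed:name]; // hypothetical pre-cut tile assets
    [tile drawInRect:rect];
}

@end

Apple's old PhotoScroller sample code demonstrated this pattern end to end, including pre-generating the tiles.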
