Saurabh 😎

WWDC 2018: Image and Graphics Best Practices

Image buffer = in-memory representation of an image (i.e. the decoded pixels, after decoding a JPEG file)
Buffer size is proportional to the image's dimensions, not the file size

Data buffer = stores the raw image file in its encoded format (so the bytes don't directly describe pixels)

Decoding: data buffer => image buffer
Decoding is CPU-intensive, so UIImage caches the decoded image buffer after the first draw

Note that even after decoding, the image buffer's size is proportional to the dimensions of the original image, not the rendered image. For example, a 2048×1536 photo decoded at 4 bytes per pixel occupies 2048 × 1536 × 4 ≈ 12 MB, even if it's drawn as a small thumbnail. This can easily lead to high memory usage, which in turn causes memory fragmentation, poor locality of reference, CPU time spent on OS memory compression, and eventually process termination

Downsampling = reduce the size of the decoded image buffer to match the rendered size => lower memory usage

After downsampling, the original data buffer can be discarded
See func downsample code sample from slides
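A sketch of that downsampling routine, reconstructed from the session's ImageIO-based example (option names are real ImageIO constants, but treat the exact signature and defaults as approximations):

```swift
import UIKit
import ImageIO

// Downsample an image on disk to a target point size without ever decoding
// the full-resolution image buffer into memory.
func downsample(imageAt imageURL: URL,
                to pointSize: CGSize,
                scale: CGFloat) -> UIImage? {
    // Don't decode (cache) the full image just to create the source.
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let imageSource = CGImageSourceCreateWithURL(imageURL as CFURL, sourceOptions) else {
        return nil
    }

    let maxDimensionInPixels = max(pointSize.width, pointSize.height) * scale
    let downsampleOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        // Decode now, on this (ideally background) thread, not lazily at draw time.
        kCGImageSourceShouldCacheImmediately: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: maxDimensionInPixels
    ] as CFDictionary

    guard let downsampledImage =
        CGImageSourceCreateThumbnailAtIndex(imageSource, 0, downsampleOptions) else {
        return nil
    }
    return UIImage(cgImage: downsampledImage)
}
```

The key points: `kCGImageSourceShouldCache: false` keeps the source from decoding the full image up front, and the thumbnail-size image buffer is the only decoded buffer ever created.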

Decoding images while a scroll view is scrolling can cause laggy scrolling (hitches), and is also bad for battery life, since the CPU spikes every time a new cell appears in the scroll view
2 ways to avoid:
Prefetch: decode/downsample ahead of time (e.g. via UICollectionViewDataSourcePrefetching) rather than at cell-display time
Do the decode/downsample work on a background queue instead of the main thread (use a serial queue to avoid thread explosion)
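The two techniques above can be combined roughly as follows. This is an illustrative sketch, not code from the session; it assumes a `downsample(imageAt:to:scale:)` helper like the one on the slides, and the class/queue names are made up:

```swift
import UIKit

class ImageGridViewController: UIViewController, UICollectionViewDataSourcePrefetching {
    // Serial queue: one decode at a time avoids thread explosion when many
    // cells are prefetched at once.
    private let decodeQueue = DispatchQueue(label: "image-decode")
    private var imageURLs: [URL] = []
    private var imageCache: [IndexPath: UIImage] = [:]

    func collectionView(_ collectionView: UICollectionView,
                        prefetchItemsAt indexPaths: [IndexPath]) {
        // Capture scale on the main thread before hopping queues.
        let scale = UIScreen.main.scale
        for indexPath in indexPaths {
            let url = imageURLs[indexPath.item]
            decodeQueue.async {
                // Downsample off the main thread, before the cell is visible.
                let image = downsample(imageAt: url,
                                       to: CGSize(width: 120, height: 120),
                                       scale: scale)
                DispatchQueue.main.async {
                    self.imageCache[indexPath] = image
                }
            }
        }
    }
}
```

The CPU work then happens ahead of display and off the main thread, so scrolling stays smooth.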

For images bundled with your application, always use Image Assets
You'll get many optimizations for free (see related WWDC session: Optimizing App Assets)

For custom UIKit drawing, it is recommended not to override drawRect:. Instead, compose the view out of UIImageView, UILabel, and CALayer properties. The main advantage: overriding drawRect: forces UIKit to allocate a backing-store bitmap for the view, whereas the heavily-optimized UIKit classes and CALayer properties don't require any extra backing store
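An illustrative sketch of that composition approach (not from the session; the `BadgeView` class and its styling are made up for the example):

```swift
import UIKit

// Instead of overriding drawRect: to draw a rounded rectangle, an icon, and
// text, compose the view from subviews and layer properties.
class BadgeView: UIView {
    private let iconView = UIImageView()
    private let titleLabel = UILabel()

    override init(frame: CGRect) {
        super.init(frame: frame)
        // CALayer properties render without allocating an extra backing store.
        layer.backgroundColor = UIColor.systemBlue.cgColor
        layer.cornerRadius = 8

        titleLabel.text = "New"
        titleLabel.textColor = .white
        addSubview(iconView)
        addSubview(titleLabel)
        // ...position subviews with Auto Layout or layoutSubviews()
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}
```

Because no drawRect: override exists, UIKit never allocates a per-view bitmap; UIImageView and UILabel manage their own optimized rendering.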