The amount of anisotropic texture filtering to be used when rendering the material property’s image contents.
- iOS 8.0+
- macOS 10.9+
- tvOS 9.0+
- watchOS 2.0+
Anisotropic filtering is a process that increases the quality of texture rendering when a textured surface appears at an extreme angle relative to the camera. This process works by sampling from multiple mipmap levels of a texture for each rendered pixel—the anisotropy level is the number of samples taken per pixel. A higher anisotropy level improves rendering quality, but at a cost in rendering performance.
For example, the image on the left in the figure below uses no anisotropic filtering, resulting in rendering artifacts as the checkerboard pattern recedes into the distance. The other images use higher `maxAnisotropy` values, reducing rendering artifacts. Anisotropic filtering requires mipmaps, so this property takes effect only if the value of the `mipFilter` property is not `SCNFilterMode.none`.
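Because anisotropic filtering depends on mipmapping, enabling it involves two steps: choose a mipmap filter, then raise the anisotropy limit. A minimal sketch (the material setup, texture name, and anisotropy value of 8 are illustrative assumptions, not part of this page):

```swift
import SceneKit

// Hypothetical example: a floor material whose texture is often viewed
// at grazing angles, where anisotropic filtering helps most.
let material = SCNMaterial()
material.diffuse.contents = "checkerboard.png"  // assumed asset name

// Anisotropic filtering requires mipmaps: set a mipmap filter other
// than SCNFilterMode.none, then raise the anisotropy limit.
material.diffuse.mipFilter = .linear
material.diffuse.maxAnisotropy = 8.0

let floor = SCNNode(geometry: SCNPlane(width: 100, height: 100))
floor.geometry?.firstMaterial = material
```

If `mipFilter` were left at its default of `SCNFilterMode.none`, the `maxAnisotropy` setting above would have no effect.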
SceneKit automatically increases or decreases anisotropy for each rendered pixel as needed to maximize rendering quality, up to the limit specified by this property. The maximum anisotropy level used when rendering depends on the graphics hardware in use. Set this property's value to the `MAXFLOAT` constant (the default) to use the highest anisotropy level the GPU supports. A `maxAnisotropy` value of 1 or lower disables anisotropic filtering.
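The three configurations described above—the hardware-limited default, an explicit cap, and filtering disabled—can be sketched as follows (the material and the cap value of 4 are illustrative assumptions):

```swift
import SceneKit

let material = SCNMaterial()
material.diffuse.mipFilter = .linear  // required for anisotropic filtering

// Default: let SceneKit use the highest anisotropy level the GPU supports.
material.diffuse.maxAnisotropy = CGFloat(MAXFLOAT)

// Explicit cap: never sample more than 4 mipmap samples per pixel,
// trading some quality at grazing angles for rendering performance.
material.diffuse.maxAnisotropy = 4.0

// A value of 1 or lower disables anisotropic filtering entirely.
material.diffuse.maxAnisotropy = 1.0
```

Capping anisotropy is a common performance lever: lower caps reduce texture bandwidth on pixels where the surface is steeply angled to the camera.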