- iOS 8.0+
- macOS 10.9+
- tvOS 9.0+
- watchOS 2.0+
Texture filtering determines the appearance of a material property’s contents when portions of the material surface appear larger or smaller than the original texture image. For example, when a texture is applied to a plane that recedes into the distance away from the camera:
- The texture coordinates at a point near the camera may correspond to a small fraction of a pixel in the original image. SceneKit uses the `magnificationFilter` property to determine the color of the sampled texel at that point.
- The texture coordinates at a point far from the camera may correspond to an area of several pixels in the original image. SceneKit uses the `minificationFilter` property to determine the color of the sampled texel at that point.
SceneKit also uses the filter specified by the `mipFilter` property when generating mipmap levels for a texture image.
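As a brief sketch, the three filter modes described above can be configured on a material property such as a material's diffuse contents; the texture asset name here is hypothetical:

```swift
import SceneKit

let material = SCNMaterial()
// "ground.png" is a placeholder texture asset name.
material.diffuse.contents = "ground.png"

// Magnification: one texel covers many screen pixels (surface near the camera).
// .linear interpolates between neighboring texels; .nearest would pick one texel.
material.diffuse.magnificationFilter = .linear

// Minification: many texels map to one screen pixel (surface far from the camera).
material.diffuse.minificationFilter = .linear

// Mip filtering: .linear blends between adjacent mipmap levels
// (combined with linear minification, this gives trilinear filtering).
material.diffuse.mipFilter = .linear
```

Setting `mipFilter` to `.none` disables mipmap sampling entirely, which can cause shimmering artifacts on distant surfaces.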