A neural transfer function that is useful for classification tasks.


class MPSCNNSoftMax : MPSCNNKernel


The softmax filter is applied across feature channels in a convolutional manner at all spatial locations. The softmax filter can be seen as the combination of an activation function (exponential) and a normalization operator.

For each feature channel k of each pixel in a feature map, the softmax filter computes the following:

result(x,y,k) = exp(pixel(x,y,k)) / (exp(pixel(x,y,0)) + ... + exp(pixel(x,y,N-1)))

Where result(x,y,k) is channel k of the result pixel at location (x,y) and N is the number of feature channels.
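As a concrete illustration of the formula above, the following plain-Swift sketch (not part of the MPS API; the function name and sample values are hypothetical) computes the softmax over one pixel's feature-channel values:

```swift
import Foundation

/// Softmax over one pixel's feature-channel values:
/// result[k] = exp(values[k]) / (exp(values[0]) + ... + exp(values[N-1]))
func softmax(_ values: [Float]) -> [Float] {
    // Subtracting the maximum before exponentiating improves numerical
    // stability and does not change the mathematical result.
    let maxValue = values.max() ?? 0
    let exps = values.map { exp($0 - maxValue) }
    let sum = exps.reduce(0, +)
    return exps.map { $0 / sum }
}

// Hypothetical channel values for a single (x, y) location.
let channelValues: [Float] = [1.0, 2.0, 3.0]
let result = softmax(channelValues)
```

The outputs are positive, preserve the ordering of the inputs, and sum to 1, which is what makes softmax useful for interpreting channel values as class probabilities.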


See Also

Softmax Layers

class MPSCNNLogSoftMax

A neural transfer function that is useful for constructing a loss function to be minimized when training neural networks.

class MPSCNNLogSoftMaxGradient

A gradient logarithmic softmax filter.

class MPSCNNSoftMaxGradient

A gradient softmax filter.
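Like any other MPSCNNKernel subclass, these filters are created with a Metal device and encoded to a command buffer. A minimal usage sketch, assuming a valid device, command buffer, and appropriately sized source and destination MPSImage objects already exist (none of which are shown here):

```swift
import MetalPerformanceShaders

// Assumes `device`, `commandBuffer`, `sourceImage`, and `destinationImage`
// are created elsewhere; this only sketches the encode step.
func encodeSoftMax(device: MTLDevice,
                   commandBuffer: MTLCommandBuffer,
                   sourceImage: MPSImage,
                   destinationImage: MPSImage) {
    // The softmax is applied across feature channels
    // at every spatial location of the source image.
    let softMax = MPSCNNSoftMax(device: device)
    softMax.encode(commandBuffer: commandBuffer,
                   sourceImage: sourceImage,
                   destinationImage: destinationImage)
}
```

MPSCNNLogSoftMax is encoded the same way; it is typically paired with a loss term during training, since log-softmax followed by negative log-likelihood is more numerically stable than computing softmax and taking the logarithm separately.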

Beta Software

This documentation contains preliminary information about an API or technology in development. This information is subject to change, and software implemented according to this documentation should be tested with final operating system software.
