Class

MPSCNNLogSoftMax

A neural transfer function that is useful for constructing a loss function to be minimized when training neural networks.

Declaration

class MPSCNNLogSoftMax : MPSCNNKernel

Overview

The logarithmic softmax filter is calculated by taking the natural logarithm of the result of a softmax filter.

For each feature channel per pixel in an image in a feature map, the logarithmic softmax filter computes the following:

result(x,y,k) = pixel(x,y,k) - ln(exp(pixel(x,y,0)) + ... + exp(pixel(x,y,N-1)))

Where result(x,y,k) is feature channel k of the result pixel at position (x,y), pixel(x,y,k) is the corresponding input value, N is the number of feature channels, and ln is the natural logarithm, that is, y = ln(x) satisfies eʸ = x.
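For example, the following sketch encodes a log softmax pass over the feature channels of an MPSImage. It is illustrative only: the 1 x 1 image size, the 10 feature channels, and the names sourceImage and destinationImage are arbitrary choices for this example, not part of the API.

import Metal
import MetalPerformanceShaders

// Acquire a Metal device that supports Metal Performance Shaders.
guard let device = MTLCreateSystemDefaultDevice(),
      MPSSupportsMTLDevice(device),
      let commandQueue = device.makeCommandQueue(),
      let commandBuffer = commandQueue.makeCommandBuffer() else {
    fatalError("Metal Performance Shaders is not supported on this device")
}

// Describe a 1 x 1 image with 10 feature channels (for example, the raw
// class scores emitted by a fully connected layer).
let descriptor = MPSImageDescriptor(channelFormat: .float16,
                                    width: 1,
                                    height: 1,
                                    featureChannels: 10)
let sourceImage = MPSImage(device: device, imageDescriptor: descriptor)
let destinationImage = MPSImage(device: device, imageDescriptor: descriptor)

// Each destination channel k at (x,y) receives
// pixel(x,y,k) - ln(exp(pixel(x,y,0)) + ... + exp(pixel(x,y,N-1))).
let logSoftMax = MPSCNNLogSoftMax(device: device)
logSoftMax.encode(commandBuffer: commandBuffer,
                  sourceImage: sourceImage,
                  destinationImage: destinationImage)

commandBuffer.commit()
commandBuffer.waitUntilCompleted()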

Relationships

Inherits From

MPSCNNKernel

See Also

Softmax Layers

class MPSCNNSoftMax

A neural transfer function that is useful for classification tasks.

class MPSCNNLogSoftMaxGradient

A gradient logarithmic softmax filter.

class MPSCNNSoftMaxGradient

A gradient softmax filter.
