Class

MPSCNNNeuronReLU

A ReLU (Rectified Linear Unit) neuron filter.

Declaration

@interface MPSCNNNeuronReLU : MPSCNNNeuron

Overview

For each pixel in an image, the filter applies the following function:

f(x) = x, if x >= 0
f(x) = a * x, if x < 0

In CNN literature this filter is called leaky ReLU. Some CNN literature defines classical ReLU as f(x) = max(0, x); to get that behavior, set the a property to 0.

Relationships

Inherits From

MPSCNNNeuron

See Also

Neuron Layers

MPSCNNNeuronAbsolute

An absolute neuron filter.

MPSCNNNeuronELU

A parametric ELU neuron filter.

MPSCNNNeuronHardSigmoid

A hard sigmoid neuron filter.

MPSCNNNeuronLinear

A linear neuron filter.

MPSCNNNeuronPReLU

A parametric ReLU (Rectified Linear Unit) neuron filter.

MPSCNNNeuronReLUN

A ReLUN neuron filter.

MPSCNNNeuronSigmoid

A sigmoid neuron filter.

MPSCNNNeuronSoftPlus

A parametric softplus neuron filter.

MPSCNNNeuronSoftSign

A softsign neuron filter.

MPSCNNNeuronTanH

A hyperbolic tangent neuron filter.

MPSCNNNeuron

A filter that applies a neuron activation function.

MPSCNNNeuronExponential

An exponential neuron filter.

MPSCNNNeuronGradient

A gradient neuron filter.

MPSCNNNeuronLogarithm

A logarithm neuron filter.

MPSCNNNeuronPower

A power neuron filter.

MPSNNNeuronDescriptor

An object that specifies properties used by a neuron kernel.

Beta Software

This documentation contains preliminary information about an API or technology in development. This information is subject to change, and software implemented according to this documentation should be tested with final operating system software.
