Class

MPSCNNNeuronReLU

A ReLU (Rectified Linear Unit) neuron filter.

Declaration

class MPSCNNNeuronReLU : MPSCNNNeuron

Overview

For each pixel in an image, the filter applies the following function:

f(x) = x      if x >= 0
f(x) = a * x  if x < 0

In CNN literature, this filter is known as leaky ReLU. The classical ReLU, max(0, x), is the special case where a is 0; to get that behavior, set the a property to 0.
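The piecewise definition can be checked in a few lines of plain Swift. The `leakyReLU` helper below is illustrative only, not part of the MPS API; it applies the same function the filter applies per pixel:

```swift
// Illustrative helper (not part of MetalPerformanceShaders):
// the per-pixel function applied by MPSCNNNeuronReLU.
func leakyReLU(_ x: Float, a: Float) -> Float {
    // Positive inputs pass through; negative inputs are scaled by a.
    return x >= 0 ? x : a * x
}

print(leakyReLU( 3.0, a: 0.5))  // positive input passes through: 3.0
print(leakyReLU(-2.0, a: 0.5))  // negative input is scaled: -1.0
print(leakyReLU(-2.0, a: 0.0))  // a = 0 reduces to classical ReLU
```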

Topics

Initializers

init(device: MTLDevice, a: Float)

Initializes a ReLU neuron filter.

Deprecated
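Because this initializer is deprecated, newer code typically builds the same filter through an MPSNNNeuronDescriptor instead. A minimal sketch of both paths, guarded so it only compiles where MetalPerformanceShaders is available, and assuming a system Metal device exists at run time:

```swift
#if canImport(MetalPerformanceShaders)
import MetalPerformanceShaders

if let device = MTLCreateSystemDefaultDevice() {
    // Deprecated path: direct initializer with leak slope a.
    let relu = MPSCNNNeuronReLU(device: device, a: 0.01)

    // Replacement path: describe the neuron, then create a generic
    // MPSCNNNeuron from the descriptor.
    let descriptor = MPSNNNeuronDescriptor.cnnNeuronDescriptor(with: .reLU,
                                                               a: 0.01)
    let relu2 = MPSCNNNeuron(device: device, neuronDescriptor: descriptor)
    _ = (relu, relu2)
}
#endif
```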

Relationships

Inherits From

MPSCNNNeuron

See Also

Neuron Layers

class MPSCNNNeuronAbsolute

An absolute neuron filter.

class MPSCNNNeuronELU

A parametric ELU neuron filter.

class MPSCNNNeuronHardSigmoid

A hard sigmoid neuron filter.

class MPSCNNNeuronLinear

A linear neuron filter.

class MPSCNNNeuronPReLU

A parametric ReLU (Rectified Linear Unit) neuron filter.

class MPSCNNNeuronReLUN

A ReLUN neuron filter.

class MPSCNNNeuronSigmoid

A sigmoid neuron filter.

class MPSCNNNeuronSoftPlus

A parametric softplus neuron filter.

class MPSCNNNeuronSoftSign

A softsign neuron filter.

class MPSCNNNeuronTanH

A hyperbolic tangent neuron filter.

class MPSCNNNeuron

A filter that applies a neuron activation function.

class MPSCNNNeuronExponential

An exponential neuron filter.

class MPSCNNNeuronGradient

A gradient neuron filter.

class MPSCNNNeuronLogarithm

A logarithm neuron filter.

class MPSCNNNeuronPower

A power neuron filter.

class MPSNNNeuronDescriptor

An object that specifies properties used by a neuron kernel.

Beta Software

This documentation contains preliminary information about an API or technology in development. This information is subject to change, and software implemented according to this documentation should be tested with final operating system software.
