Class

MPSCNNNeuronPReLU

A parametric ReLU (Rectified Linear Unit) neuron filter.

Declaration

@interface MPSCNNNeuronPReLU : MPSCNNNeuron

Overview

For each pixel in an image, the filter applies the following function:

f(x_i) = x_i,        if x_i >= 0
f(x_i) = a_i * x_i,  if x_i < 0

where i is in [0 ... channels - 1]. That is, the slope parameters a_i are learned and applied to each channel separately. Compare this to MPSCNNNeuronReLU, where a single parameter a is shared across all channels.

Topics

Initializers

Relationships

Inherits From

See Also

Neuron Layers

MPSCNNNeuronAbsolute

An absolute neuron filter.

MPSCNNNeuronELU

A parametric ELU neuron filter.

MPSCNNNeuronHardSigmoid

A hard sigmoid neuron filter.

MPSCNNNeuronLinear

A linear neuron filter.

MPSCNNNeuronReLUN

A ReLUN neuron filter.

MPSCNNNeuronReLU

A ReLU (Rectified Linear Unit) neuron filter.

MPSCNNNeuronSigmoid

A sigmoid neuron filter.

MPSCNNNeuronSoftPlus

A parametric softplus neuron filter.

MPSCNNNeuronSoftSign

A softsign neuron filter.

MPSCNNNeuronTanH

A hyperbolic tangent neuron filter.

MPSCNNNeuron

A filter that applies a neuron activation function.

MPSCNNNeuronExponential

An exponential neuron filter.

MPSCNNNeuronGradient

A gradient neuron filter.

MPSCNNNeuronLogarithm

A logarithm neuron filter.

MPSCNNNeuronPower

A power neuron filter.

MPSNNNeuronDescriptor

An object that specifies properties used by a neuron kernel.

Beta Software

This documentation contains preliminary information about an API or technology in development. This information is subject to change, and software implemented according to this documentation should be tested with final operating system software.
