A hard sigmoid neuron filter.


class MPSCNNNeuronHardSigmoid : MPSCNNNeuron


For each pixel in an image, the filter applies the following function:

f(x) = clamp((a * x) + b, 0, 1)
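The activation above can be modeled on the CPU with a few lines of Swift; this is a sketch of the math only, not the GPU filter itself, and the parameter names mirror the `a` and `b` in the formula:

```swift
import Foundation

// CPU model of the hard sigmoid activation used by
// MPSCNNNeuronHardSigmoid: f(x) = clamp((a * x) + b, 0, 1).
func hardSigmoid(_ x: Float, a: Float, b: Float) -> Float {
    // clamp to the [0, 1] range
    return min(max(a * x + b, 0), 1)
}
```

With illustrative values a = 0.2 and b = 0.5 (not framework defaults), an input of 0 maps to 0.5, while inputs far below or above zero saturate at 0 and 1 respectively.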



init(device: MTLDevice, a: Float, b: Float)

Initializes a hard sigmoid neuron filter.
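On the GPU the filter applies this function independently to every pixel; a minimal CPU sketch of the same per-pixel arithmetic, where `a` and `b` correspond to the initializer's parameters and the pixel values are illustrative:

```swift
import Foundation

// Applies the hard sigmoid element-wise over a flat pixel buffer,
// mirroring what the filter computes per pixel on the GPU.
func applyHardSigmoid(to pixels: [Float], a: Float, b: Float) -> [Float] {
    return pixels.map { min(max(a * $0 + b, 0), 1) }
}

let output = applyHardSigmoid(to: [-4, 0, 4], a: 0.25, b: 0.5)
// -4 maps below 0 and clamps to 0; 0 maps to 0.5; 4 maps above 1 and clamps to 1
```

In a real pipeline the filter would instead be created with `MPSCNNNeuronHardSigmoid(device:a:b:)` and encoded onto a Metal command buffer.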



Inherits From

MPSCNNNeuron

See Also

Neuron Layers

class MPSCNNNeuronAbsolute

An absolute neuron filter.

class MPSCNNNeuronELU

A parametric ELU neuron filter.

class MPSCNNNeuronLinear

A linear neuron filter.

class MPSCNNNeuronPReLU

A parametric ReLU (Rectified Linear Unit) neuron filter.

class MPSCNNNeuronReLUN

A ReLUN neuron filter.

class MPSCNNNeuronReLU

A ReLU (Rectified Linear Unit) neuron filter.

class MPSCNNNeuronSigmoid

A sigmoid neuron filter.

class MPSCNNNeuronSoftPlus

A parametric softplus neuron filter.

class MPSCNNNeuronSoftSign

A softsign neuron filter.

class MPSCNNNeuronTanH

A hyperbolic tangent neuron filter.

class MPSCNNNeuron

A filter that applies a neuron activation function.

class MPSCNNNeuronExponential

An exponential neuron filter.

class MPSCNNNeuronGradient

A gradient neuron filter.

class MPSCNNNeuronLogarithm

A logarithm neuron filter.

class MPSCNNNeuronPower

A power neuron filter.

class MPSNNNeuronDescriptor

An object that specifies properties used by a neuron kernel.

Beta Software

This documentation contains preliminary information about an API or technology in development. This information is subject to change, and software implemented according to this documentation should be tested with final operating system software.
