Which of the following is NOT an activation function?
An activation function determines the output of a neuron in a neural network based on its input. Crucially, a non-linear activation function is what allows a network to model complex, non-linear relationships between inputs and outputs. Some common activation functions are:
Sigmoid: The sigmoid function maps any real value to a value between 0 and 1, following σ(x) = 1 / (1 + e^(-x)). Its S-shaped curve makes it a common choice for binary classification and probability estimation.
Hyperbolic tangent: The tanh function maps any real value to a value between -1 and 1, following tanh(x) = (e^x - e^(-x)) / (e^x + e^(-x)). Its shape is similar to the sigmoid's, but it is symmetric around the origin, and it is often used for regression or classification problems.
ReLU: The rectified linear unit maps any negative value to 0 and any positive value to itself: ReLU(x) = max(0, x). Its piecewise-linear shape is cheap to compute, and it is often used for hidden layers in deep neural networks. (A minimal sketch of all three functions appears after this list.)
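To make the three definitions concrete, here is a minimal sketch in Python using NumPy; the function names and the sample input are illustrative, not part of any particular library's API:

```python
import numpy as np

def sigmoid(x):
    # Maps any real value into (0, 1): 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Maps any real value into (-1, 1); symmetric around the origin
    return np.tanh(x)

def relu(x):
    # Maps negative values to 0 and leaves positive values unchanged
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(sigmoid(x))  # all outputs strictly between 0 and 1
print(tanh(x))     # all outputs strictly between -1 and 1
print(relu(x))     # [0.  0.  0.  0.5 2. ]
```

Note how only the negative inputs are changed by ReLU, while sigmoid and tanh squash the whole range.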
Additive is not an activation function; it is a term that describes a property of some functions. An additive function satisfies f(x + y) = f(x) + f(y) for all x and y. Over the real numbers, continuous additive functions are exactly the linear functions f(x) = cx, which have a constant slope and therefore introduce no non-linearity, as the sketch below illustrates.
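A small sketch of why a linear (additive) map cannot serve as an activation function: two layers with nothing non-linear between them collapse into a single linear layer. The weights below are arbitrary random matrices chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))  # first layer weights
W2 = rng.standard_normal((2, 4))  # second layer weights
x = rng.standard_normal(3)        # input vector

# "Network" with a linear (identity) activation between the layers
h = W1 @ x              # first layer output
y_two_layers = W2 @ h   # second layer output

# Equivalent single linear layer with weights W2 @ W1
y_one_layer = (W2 @ W1) @ x

print(np.allclose(y_two_layers, y_one_layer))  # True
```

Since the stacked layers are indistinguishable from one linear map, no amount of depth adds modeling power without a non-linear activation in between.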