Monday, July 6, 2020

Activation Functions in Neural Networks

The simplest one is called the 'Step function' or 'Threshold function':
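As a quick illustration, here is a minimal Python sketch of the step function (the threshold of 0 and the 0/1 output values are assumptions on my part; some definitions use -1/+1 instead):

    def step(x):
        # Fires (returns 1.0) when the input reaches the threshold, assumed to be 0 here.
        return 1.0 if x >= 0 else 0.0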

Second is the 'Rectifier function':
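The rectifier (ReLU) is simply f(x) = max(0, x). A one-line Python sketch:

    def relu(x):
        # Passes positive inputs through unchanged and clips negative inputs to 0.
        return max(0.0, x)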
Third is the Sigmoid function. We also see this one in the 'logistic regression' algorithm.
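The sigmoid is f(x) = 1 / (1 + exp(-x)), which squashes any real input into the range (0, 1). A small Python sketch:

    import math

    def sigmoid(x):
        # Squashes any real-valued input into the open interval (0, 1).
        return 1.0 / (1.0 + math.exp(-x))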
Fourth is the Hyperbolic Tangent function.
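Hyperbolic tangent squashes inputs into (-1, 1). A sketch written out from its definition (Python's math module also provides math.tanh directly):

    import math

    def tanh(x):
        # tanh(x) = (exp(x) - exp(-x)) / (exp(x) + exp(-x)); outputs lie in (-1, 1).
        return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))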
Fifth is Softplus. The Softplus function is a smooth version of the Rectifier function: f(x) = log(1 + exp(x))
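In Python, directly from the formula above:

    import math

    def softplus(x):
        # Smooth approximation of the rectifier: log(1 + exp(x)).
        return math.log(1.0 + math.exp(x))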
Below is an illustrative image showing how multiple activation functions can be used in a neural network: here, one in the hidden layers and one in the output layer:
A research paper proposes that the Rectifier function gives a better-performing neural network than one using the Sigmoid or Hyperbolic Tangent: 'Deep Sparse Rectifier Neural Networks', Xavier Glorot et al., 2011.

Sixth is the Softmax function. It converts a vector of K real numbers into a probability distribution over K possible outcomes.
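A minimal Python sketch of softmax over a list of K numbers (subtracting the maximum is a common numerical-stability trick I am adding here; it does not change the result):

    import math

    def softmax(q):
        # Shift by the max for numerical stability; the output is unchanged.
        m = max(q)
        exps = [math.exp(qj - m) for qj in q]
        total = sum(exps)
        # Each output is non-negative and the outputs sum to 1,
        # so together they form a probability distribution.
        return [e / total for e in exps]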
Statisticians usually call softmax a "multiple logistic" function. It reduces to the simple logistic function when there are only two categories. Suppose the value for the first category is q_1, and you choose to set q_2 to 0. Then

    p_1 = exp(q_1) / sum_{j=1..c} exp(q_j)
        = exp(q_1) / (exp(q_1) + exp(0))
        = 1 / (1 + exp(-q_1))

and p_2, of course, is 1 - p_1. (Ref: ai-faq/neural-nets)
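For example, with two categories and q_2 set to 0, the softmax and sigmoid sketches above should agree (the value q1 = 1.5 is just an arbitrary example):

    q1 = 1.5
    p = softmax([q1, 0.0])
    print(p[0], sigmoid(q1))        # both print roughly 0.8176
    print(p[1], 1.0 - sigmoid(q1))  # and p_2 = 1 - p_1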
