Neural Networks

Activation Functions in ANNs (Conclusion)

In my last article, Activation Functions in ANNs, we discussed a few activation functions; now let's explore some of the other available activation functions. Tanh Function: The tanh function is a scaled sigmoid function, similar in shape to the sigmoid. It is nonlinear, so we can stack more than one layer of neurons depending on the requirement. Its range is (-1, 1).…
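A minimal NumPy sketch (my own addition, not from the article) showing tanh and the fact that it is just a sigmoid rescaled to the range (-1, 1):

```python
import numpy as np

x = np.linspace(-5.0, 5.0, 11)

# tanh(x) = (e^x - e^-x) / (e^x + e^-x), squashing inputs into (-1, 1)
print(np.tanh(x))

# The same curve written as a scaled sigmoid: tanh(x) = 2*sigmoid(2x) - 1
print(2.0 / (1.0 + np.exp(-2.0 * x)) - 1.0)
```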

Continue Reading

Activation Function, Basics, Neural Networks

Activation Functions in ANNs (Part-1)

Introduction: In an ANN, the activation function of a node is the function that determines the output the node produces given an input or set of inputs, often described as a threshold past which the node fires. Activation functions can be linear or non-linear, but nonlinear functions are mostly used in ANNs. This is very important to the way a network learns because, in light…
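As an illustrative sketch of that definition (the names `step` and `node_output` are mine, not the article's), a node applies its activation function to the weighted sum of its inputs; with a step activation it fires only once that sum passes the threshold:

```python
import numpy as np

def step(z, threshold=0.0):
    # Threshold activation: output 1 only once the weighted sum passes the threshold
    return np.where(z >= threshold, 1.0, 0.0)

def node_output(inputs, weights, bias, activation=step):
    # A single node: activation applied to the weighted sum of its inputs
    z = np.dot(weights, inputs) + bias
    return activation(z)

print(node_output(np.array([1.0, 0.5]), np.array([0.4, -0.2]), bias=0.1))  # 1.0
```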

Continue Reading

Basics, Neural Networks, Python

Using K-means to find the optimal nodal center for Radial Basis Function

Introduction: In my previous article, "Introduction to the Perceptron Algorithm", we saw how a single-layer perceptron model can be used to classify an OR gate. But when the same model was used to classify an XOR gate, it failed miserably. The problem was linearity: if the classes are not linearly separable, the single-layer perceptron model…
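A minimal sketch of the idea (my own, assuming scikit-learn; the article's exact code may differ): K-means picks the nodal centers for the radial basis functions, and a linear fit on the resulting Gaussian features then separates XOR:

```python
import numpy as np
from sklearn.cluster import KMeans

# XOR inputs and labels: not linearly separable in the original space
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

# K-means chooses the nodal centers for the radial basis functions;
# with 4 clusters on 4 points, each input becomes its own center
centers = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X).cluster_centers_

def rbf_features(X, centers, gamma=1.0):
    # Gaussian RBF feature: phi_j(x) = exp(-gamma * ||x - c_j||^2)
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return np.exp(-gamma * d ** 2)

# In the RBF feature space a linear model can separate XOR
Phi = rbf_features(X, centers)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print((Phi @ w > 0.5).astype(int))  # [0 1 1 0]
```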

Continue Reading

Basics, Deep Learning, Neural Networks, Python

Introduction to the Perceptron Algorithm

Perceptron Algorithm: The Perceptron model forms the basis of any neural network. This is where it all began, and it ultimately led to the development of "Neural Networks" and "Deep Learning", which are the buzzwords nowadays. In this article, I am going to show the mathematics behind the well-known Perceptron algorithm using a 2-input, 2-output model. Although some amount of linear algebra…
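As a hedged sketch (my own simplification to a single output, not the article's 2-input 2-output derivation), the classic perceptron learning rule w ← w + η(target − prediction)·x converges on the linearly separable OR gate:

```python
import numpy as np

def perceptron_train(X, y, lr=0.1, epochs=10):
    # Perceptron learning rule: w <- w + lr * (target - prediction) * x
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = 1.0 if np.dot(w, xi) + b >= 0 else 0.0
            w += lr * (target - pred) * xi
            b += lr * (target - pred)
    return w, b

# OR gate: linearly separable, so the perceptron converges
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 1.0])
w, b = perceptron_train(X, y)
print([int(np.dot(w, x) + b >= 0) for x in X])  # [0, 1, 1, 1]
```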

Continue Reading