Determining the quantity of weights and biases in an ANN

I created this example to remind myself how to determine the number of weights and biases in an Artificial Neural Network (ANN). It isn't something I calculate regularly, so this is just a quick refresher in case I forget. Let's get to it!

Imagine we are going to feed 2 inputs (A and B) through an ANN with 1 hidden layer. The hidden layer has 2 neurons (C and D), followed by a softmax output layer with 1 neuron (E) that provides a binary output. A visual example is shown below. How many weights and biases exist in our neural network?

Biases

Each neuron in the hidden and output layers has its own bias; the input layer has none. So the bias count is simply the number of neurons in those layers.

| Layer | Biases |
| --- | --- |
| Hidden layer (C and D) | 2 |
| Output layer (E) | 1 |
| **Total** | **3** |
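
As a quick sanity check, here is a minimal Python sketch of that count (the `layer_sizes` list is my own naming, not from any library): every layer after the input contributes one bias per neuron.

```python
# Neurons per layer: [inputs, hidden, output]
layer_sizes = [2, 2, 1]

# One bias per neuron in every layer except the input layer
biases = sum(layer_sizes[1:])
print(biases)  # 3
```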

Weights

There is one weight per connection, so between any two adjacent layers the weight count is the number of neurons in one layer multiplied by the number of neurons in the next.

| Connections | Weights |
| --- | --- |
| Between the input layer and the hidden layer (2 * 2): AC, AD, BC, BD | 4 |
| Between the hidden layer and the output layer (2 * 1): CE, DE | 2 |
| **Total** | **6** |
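
The same idea works for the weights, again as a small sketch with my own variable names: each pair of adjacent layers contributes (neurons in the first) * (neurons in the second) weights.

```python
layer_sizes = [2, 2, 1]

# One weight per connection between adjacent layers
weights = sum(a * b for a, b in zip(layer_sizes, layer_sizes[1:]))
print(weights)  # 2*2 + 2*1 = 6
```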

So, in the diagram above we have a total of 3 biases and 6 weights, or 9 trainable parameters. Why is this relevant? It becomes relevant when we consider backpropagation. That is how a neural network learns: it adjusts the weights and biases across the various neurons so that it reduces the loss generated by the network.
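
If you want to double-check the arithmetic, most frameworks can count the parameters for you. Below is a minimal sketch in PyTorch; the post doesn't use any particular framework, so this model definition is just my assumption of the same 2-2-1 shape.

```python
import torch.nn as nn

# Same 2-2-1 shape as above; activation functions (e.g. softmax)
# add no trainable parameters, so they are omitted here
model = nn.Sequential(
    nn.Linear(2, 2),  # 4 weights + 2 biases (A,B -> C,D)
    nn.Linear(2, 1),  # 2 weights + 1 bias  (C,D -> E)
)

total = sum(p.numel() for p in model.parameters())
print(total)  # 9 = 6 weights + 3 biases
```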

