Dissecting a Neural Network

Today, let's look inside a neural network. I'm Joanna Sandretto, the Lead Developer here at Greyfeather, and building on our last couple of posts, let's go a step further.

A good way to understand neural networks is to look at a simple example that implements the XOR operation. XOR is the logical operation for “exclusive or”: it is true when one, and only one, of two inputs is true. That is, given inputs x and y, the result is true if exactly one of x and y is true, and false if they are both true or both false. We can represent the XOR operation in a truth table as follows, where 1 represents true and 0 represents false:

x | y | x XOR y
--|---|--------
0 | 0 | 0
0 | 1 | 1
1 | 0 | 1
1 | 1 | 0
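
Most languages already have XOR built in, so the table is easy to verify in code. In Python, for example, it's the ^ operator on integers:

```python
# XOR on 0/1 values: true exactly when the two inputs differ.
for x in (0, 1):
    for y in (0, 1):
        print(x, y, x ^ y)
```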

The figure below shows a simple neural network with an input layer, one hidden layer, and an output layer. The first layer has our two inputs x and y along with a bias unit that has a value of 1. The hidden layer has two nodes plus a bias unit with a value of 1. In a real network each node often applies a more complex function, but for simplicity here each node sums all the input values it receives and compares the result to a threshold. The threshold is 0, so a neuron will fire when its value is greater than 0. The bias units always fire because their value is 1, which is greater than 0. The numbers along the edges connecting the neurons are the activation values that are passed to the next layer when the neuron fires.

As a concrete example, say x is true and y is false. Here x will be 1 and y will be 0, so the neuron associated with x will fire and pass its activation values on to the next layer, while the neuron associated with y will not. In the second layer, both neurons' values are above our threshold of 0, so both will fire and pass an activation value to the output layer. The value of the neuron in the output layer is -20 + 15 + 15 = 10, which is greater than 0, so the output of the network is 1.
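
Here is a minimal sketch of this forward pass in Python. The output unit's numbers (a bias contribution of -20 and 15 from each hidden neuron that fires) come straight from the walkthrough above; the hidden-layer weights are assumptions chosen to reproduce the behavior described, since the figure's edge values for that layer aren't in the text:

```python
def fires(value):
    # A neuron fires when its summed input exceeds the threshold of 0.
    return value > 0

def xor_network(x, y):
    # Hidden layer: hypothetical edge weights chosen so that the first
    # neuron acts like OR (fires when either input is on) and the second
    # fires unless both inputs are on. The figure's actual values may
    # differ; any weights with this firing behavior give the same result.
    h1 = fires(10 * x + 10 * y - 5)    # the -5 comes from the bias unit
    h2 = fires(-10 * x - 10 * y + 15)  # the +15 comes from the bias unit

    # Output layer: these numbers are from the walkthrough above:
    # -20 from the bias unit plus 15 from each hidden neuron that fired.
    return int(fires(-20 + 15 * h1 + 15 * h2))
```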

Now let’s say x is true and y is true. In this case, both x and y will fire and pass their activation values to the next layer. This time only the first hidden neuron’s value exceeds the threshold; the second, which detects that x and y are not both true, does not fire. The value of the neuron in the output layer is therefore -20 + 15 = -5, which is not greater than 0. As expected, the result is 0.
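
Running the sketch above over all four input combinations reproduces the truth table:

```python
for x in (0, 1):
    for y in (0, 1):
        print(f"x={x} y={y} -> {xor_network(x, y)}")
# x=0 y=0 -> 0
# x=0 y=1 -> 1
# x=1 y=0 -> 1
# x=1 y=1 -> 0
```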

The XOR operation is an excellent example of why neural networks are helpful in solving complex problems. As problems become more complex, solving them often requires combining multiple functions or operations. Neural networks allow us to do just that. In the case of XOR, we need to check for the following (combined in code in the sketch after this list):

x OR y

x AND y

not x AND not y
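
One way to read that combination in code: the output should be 1 exactly when the OR check passes and the AND check fails, and the third check, not x AND not y, is just the case where the OR check fails. A sketch of this reading:

```python
def xor_from_parts(x, y):
    or_part  = x or y    # x OR y
    and_part = x and y   # x AND y
    # "not x AND not y" is exactly the case where or_part fails, so
    # XOR reduces to: the OR check passes and the AND check fails.
    return int(or_part and not and_part)
```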

If all of these operations can be combined in a network with a single hidden layer, imagine what is possible in a deep neural network with multiple hidden layers...