# Functional Programming and Intelligent Algorithms

## Tutorial 1.7: Linear and non-linear classifiers

*(Last updated: 1 January 2015)*

### Overview

**Reading:** Stephen Marsland: Sections 3.4-3.5 and
the introduction to Chapter 4.

### Problem 1: A Neural Network for the XOR problem

#### Step 1: The multi-neuron perceptron (a single layer)

If you have not already done so, now is the time to complete
Problem 2 of Tutorial 5, so that you have a data type `Layer`
to handle one individual layer (hidden or output) of the neural network.
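One possible representation is sketched below. The names `Weights` and the list-of-neurons design are assumptions, not the only reasonable choice; your solution from Tutorial 5 may look different. Here the first weight of each neuron multiplies a constant quasi-input of +1, and the threshold fires on strictly positive sums.

```haskell
-- A neuron is just its weight vector; by convention the first
-- weight multiplies the constant quasi-input.
type Weights = [Double]

-- A layer is a list of neurons (hidden or output).
type Layer = [Weights]

-- Recall for one layer: every neuron sees the same input vector,
-- with the constant quasi-input 1 prepended.
recallLayer :: Layer -> [Double] -> [Double]
recallLayer layer xs = map neuron layer
  where
    neuron ws   = threshold (sum (zipWith (*) ws (1 : xs)))
    threshold s = if s > 0 then 1 else 0
```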

#### Step 2: Data type for multi-layer networks

Define a data type `Network` for the neural network,
consisting of a list of layers.
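With the list-of-neurons representation of a layer assumed above, a type synonym is the simplest option; a `data` declaration with a constructor works just as well if you prefer stronger typing.

```haskell
-- Assumed representation from Step 1: a neuron is its weight
-- vector, a layer is a list of neurons.
type Weights = [Double]
type Layer   = [Weights]

-- A network is the list of its layers, ordered from the first
-- (hidden) layer to the output layer.
type Network = [Layer]
```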

#### Step 3: A network for XOR

- The XOR network has two layers.
- The hidden layer has two neurons.
- The output layer has one neuron.
- Each of the three neurons has two inputs and three weights
  (one per input, plus one for the constant quasi-input).

*Look up the weights used to solve the XOR problem in Marsland's book.
Define the three neurons with appropriate weights, and define the
network by assembling the three nodes.*

Test your definition by evaluating it in GHCi.
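If you want a known-good network to compare against, the weights below are one classic solution; they are an assumption for illustration and may differ from the ones in Marsland's book. They use a +1 quasi-input and a threshold that fires on strictly positive sums.

```haskell
type Weights = [Double]
type Layer   = [Weights]
type Network = [Layer]

-- Hidden neuron 1 behaves like OR(x1, x2), hidden neuron 2 like
-- AND(x1, x2); the output neuron fires when OR holds but AND does
-- not.  The first weight in each list is for the +1 quasi-input.
xorNetwork :: Network
xorNetwork =
  [ [ [-0.5, 1, 1]       -- fires when x1 + x2 > 0.5  (OR)
    , [-1.5, 1, 1] ]     -- fires when x1 + x2 > 1.5  (AND)
  , [ [-0.5, 1, -1] ]    -- fires when OR fires and AND does not
  ]
```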

#### Step 4: Recall for multi-layer networks

Having completed Problem 2 of Tutorial 5 previously, you have
a recall function for a single layer. Now we need the following
recall function for a whole neural network.

```
recallNetwork :: Network -> InputVector -> OutputValue
```

This function will need to do recall for the first layer in
the network (the list of layers). The output from the recall of
the first layer is used as input for recall in the second
layer. This continues through the list, and the output
from the last layer is the output of `recallNetwork`.

*Implement `recallNetwork`.*
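A minimal sketch, assuming the representations from the earlier steps (a neuron is a weight vector, a +1 quasi-input, and a strict threshold); adapt it to whatever types you defined in Tutorial 5. Threading the input through the layers is exactly a left fold over the list of layers.

```haskell
type Weights     = [Double]
type Layer       = [Weights]
type Network     = [Layer]
type InputVector = [Double]
type OutputValue = Double

-- Recall for one layer: every neuron sees the same inputs,
-- with the constant quasi-input 1 prepended.
recallLayer :: Layer -> [Double] -> [Double]
recallLayer layer xs = map neuron layer
  where
    neuron ws   = threshold (sum (zipWith (*) ws (1 : xs)))
    threshold s = if s > 0 then 1 else 0

-- Feed the input through the layers in order; the single value
-- produced by the output layer is the network's output.
recallNetwork :: Network -> InputVector -> OutputValue
recallNetwork net xs = head (foldl (flip recallLayer) xs net)
```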

#### Step 5: Testing

*Finally, test your definitions with the following evaluations:*

```
recallNetwork xorNetwork [0,0]
recallNetwork xorNetwork [0,1]
recallNetwork xorNetwork [1,0]
recallNetwork xorNetwork [1,1]
```

#### Step 6: Bug search

There are two common sources of error in this implementation.

- What is the value of the threshold function at 0?
That is, does the neuron fire when the weighted sum is exactly 0?
In floating-point problems this hardly matters, but with
the binary XOR problem it does. If your test fails, try changing
the threshold function.
- What is the sign of weight 0, i.e. the weight corresponding to
the constant quasi-input?
Very often we use -1, but the XOR network
in Marsland's book has +1 as its quasi-input. If your network uses -1,
you have to swap the sign of that weight.
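The first point can be made concrete by comparing the two natural step functions. With some weight sets a neuron's sum lands exactly on 0, and the two variants then give different outputs:

```haskell
-- Fires only on strictly positive sums.
thresholdStrict :: Double -> Double
thresholdStrict s = if s > 0 then 1 else 0

-- Also fires when the sum is exactly 0.
thresholdAtZero :: Double -> Double
thresholdAtZero s = if s >= 0 then 1 else 0
```

If your XOR test fails on one of the four inputs, check whether swapping one variant for the other fixes it.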