# Dimensionality Reduction

## Functional Programming and Intelligent Algorithms

*(Last updated: 1 January 2015)*

At this point, the priority is to complete exercises implementing
and using the neural network.

If you have time, the following exercises can give practice on
the techniques from today's lecture.
They require a certain level of independence and may be
challenging.

### Feature Selection

Consider your own neural network and one of the datasets
(iris or breast cancer).
Try to remove features from the dataset and see whether it
improves performance.
Which features can be removed with no penalty?
Can any be removed for benefit?
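If your network stores the dataset as a list of rows, removing a feature amounts to dropping one column. A minimal sketch (the list-of-rows representation and the name `removeFeature` are assumptions; adapt them to your own data types):

```haskell
-- Drop feature number i (0-indexed) from every sample in the dataset.
-- Assumes each sample is a list of feature values of equal length.
removeFeature :: Int -> [[Double]] -> [[Double]]
removeFeature i = map dropCol
  where dropCol xs = take i xs ++ drop (i + 1) xs
```

For example, `removeFeature 1` applied to the iris samples would discard the second feature; retrain the network on the reduced data and compare the error rates.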

### Fisher Linear Discriminant

Although Marsland presents LDA in the context of dimensionality
reduction, that is not the original purpose of LDA.
The lecture focused on the Fisher Linear Discriminant (FLD)
which is LDA as a classifier.

Implement FLD on one of the two datasets we have worked on
(breast cancer or iris).
Note that the formulæ are simple and based
on fundamental functions of statistics.
If you have any experience using spreadsheets, it should be
straightforward to implement FLD in such software.
Of course, you may implement FLD in Haskell if you prefer.
You may also use Marsland's Python code.
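As a starting point, here is a sketch of the two-class FLD in Haskell, restricted to two features so that the within-class scatter matrix can be inverted by the closed-form 2x2 formula. The pair representation and all names are illustrative assumptions, not part of the exercise:

```haskell
-- Two-feature samples represented as pairs; an assumption for this sketch.
type Vec = (Double, Double)

mean :: [Vec] -> Vec
mean vs = (sum (map fst vs) / n, sum (map snd vs) / n)
  where n = fromIntegral (length vs)

-- Scatter matrix of one class, stored row-wise as (a,b,c,d).
scatter :: [Vec] -> (Double, Double, Double, Double)
scatter vs = foldr add (0, 0, 0, 0) vs
  where
    (mx, my) = mean vs
    add (x, y) (a, b, c, d) =
      let dx = x - mx
          dy = y - my
      in (a + dx*dx, b + dx*dy, c + dy*dx, d + dy*dy)

-- Fisher direction w = Sw^(-1) (m1 - m2), with Sw the sum of the
-- two class scatter matrices, inverted by the 2x2 closed form.
fld :: [Vec] -> [Vec] -> Vec
fld cls1 cls2 = (ia*dx + ib*dy, ic*dx + idd*dy)
  where
    (a1, b1, c1, d1) = scatter cls1
    (a2, b2, c2, d2) = scatter cls2
    (a, b, c, d) = (a1 + a2, b1 + b2, c1 + c2, d1 + d2)
    det = a*d - b*c
    (ia, ib, ic, idd) = (d/det, -b/det, -c/det, a/det)
    (m1x, m1y) = mean cls1
    (m2x, m2y) = mean cls2
    (dx, dy) = (m1x - m2x, m1y - m2y)

-- Classify a sample by projecting onto w and thresholding at the
-- midpoint of the two projected class means (a simple choice of
-- threshold; other choices are possible).
classify :: [Vec] -> [Vec] -> Vec -> Bool
classify cls1 cls2 p = project p > t
  where
    (wx, wy) = fld cls1 cls2
    project (x, y) = wx*x + wy*y
    t = (project (mean cls1) + project (mean cls2)) / 2
```

The same structure carries over to all four iris features or the breast-cancer data, but then the matrix inverse needs a general routine (or a linear-algebra library) instead of the 2x2 formula.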

Hans Georg Schaathun /
hasc@hials.no