Functional Programming and Intelligent Algorithms

Week 2: Completing and Using a Neural Network

(Last updated: $Date$)

Objective

During this second week you will add more features to the neural network you built last week, and in the process we will explore some more advanced topics in functional programming.

Reading

These are primary recommendations. You can find equivalent material in other books if you prefer.

  1. Stephen Marsland, Machine Learning: An Algorithmic Perspective, Chapters 2, 4, 6, and 14
  2. Simon Thompson, Haskell: The Craft of Functional Programming, Chapters 11, 13, and 18

Schedule

A detailed schedule will not be fixed until we see how we fared in the first week. The following is an outline of the topics we plan to cover.

Time          Topic                                                 Reading

Monday 9 February
8:15-10       Review of last week
rest of day   Catch up with Tutorials 5-6

Tuesday 10 February
8:15-10       Lecture: From perceptron to back-propagation         Marsland Chapter 4
              (see the first sketch below)
rest of day   Implementation and testing of back-propagation
TBA           Question and answer session

Wednesday 11 February
8:15-10       Complete your back-propagation networks
10:15-11      Lecture: Pseudo-random number generators
10:15-13      Implement random initialisation of weights
              (see the second sketch below)
              Take a lunch break at some point
13:15-14      Lecture: State machines and monads                   Thompson Chapter 18
rest of day   Catch up with exercises

Thursday 12 February
8:30-10       Lecture: Neural networks in the real world           Marsland Chapter 4
              Practical problems for the neural network
              Discussion exercises

Friday 13 February
8:30-10       Lecture: The curse of dimensionality                 Marsland Chapter 6
rest of day   Catch up with previous tutorials
optional      Exercises on dimensionality reduction                Marsland Chapter 6
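
For Tuesday's step from the perceptron to back-propagation, the sketch below shows the two ingredients that change: the hard threshold is replaced by the logistic (sigmoid) activation, and each weight update is scaled by the slope of that activation at the neuron's output. It is only a minimal illustration of the idea in Marsland Chapter 4; the function names, the bias convention, and the learning-rate parameter eta are illustrative assumptions, not a required interface for the tutorial.

    module BackPropSketch where

    -- Logistic (sigmoid) activation, replacing the perceptron's step function.
    sigmoid :: Double -> Double
    sigmoid x = 1 / (1 + exp (-x))

    -- Its derivative, written in terms of the output y = sigmoid x.
    sigmoidDeriv :: Double -> Double
    sigmoidDeriv y = y * (1 - y)

    -- Weighted sum of the inputs; the bias is assumed to be handled as an
    -- extra weight on a constant input.
    netInput :: [Double] -> [Double] -> Double
    netInput ws xs = sum (zipWith (*) ws xs)

    -- Error term (delta) for an output neuron: the output error scaled by
    -- the slope of the activation at the neuron's output.
    outputDelta :: Double -> Double -> Double
    outputDelta target output = (target - output) * sigmoidDeriv output

    -- Gradient-descent update of one weight, given the learning rate eta,
    -- the neuron's delta, and the input carried by that weight.
    updateWeight :: Double -> Double -> Double -> Double -> Double
    updateWeight eta delta input w = w + eta * delta * input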
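
The Wednesday sessions tie together three of the listed topics: pseudo-random number generators, random initialisation of weights, and the State monad from Thompson Chapter 18. The sketch below threads a generator through Control.Monad.State so the weight-drawing code never has to pass the generator around explicitly. It assumes the random package's StdGen and mtl's State monad; the weight range (-0.5, 0.5), the seed 42, and all function names are illustrative choices, not part of the tutorial hand-out.

    module InitWeights where

    import Control.Monad (replicateM)
    import Control.Monad.State (State, evalState, state)
    import System.Random (StdGen, mkStdGen, randomR)

    -- Draw one weight in the assumed range (-0.5, 0.5), threading the
    -- generator through the State monad instead of passing it by hand.
    randomWeight :: State StdGen Double
    randomWeight = state (randomR (-0.5, 0.5))

    -- One neuron gets a weight per input plus one extra for the bias.
    randomNeuron :: Int -> State StdGen [Double]
    randomNeuron nInputs = replicateM (nInputs + 1) randomWeight

    -- A layer is a list of such neurons.
    randomLayer :: Int -> Int -> State StdGen [[Double]]
    randomLayer nInputs nNeurons = replicateM nNeurons (randomNeuron nInputs)

    -- Example: three neurons with two inputs each, initialised from a fixed
    -- seed so the result is reproducible while testing.
    exampleLayer :: [[Double]]
    exampleLayer = evalState (randomLayer 2 3) (mkStdGen 42)

Writing the same code with explicit generator passing would force every function to return and re-feed the updated StdGen; hiding that plumbing is exactly the pattern the state-machines-and-monads lecture refers to.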