---
title: Optimisation and GA
categories: session
---

# Reading

The textbook chapters are available on Blackboard.

+ Haupt and Haupt Chapter 1 (cursory)
    + we will not dwell on the various optimisation algorithms discussed here
    + the general background is important though
    + R&N Sections 4.1-4.2 give a briefer introduction to the same material
+ Haupt and Haupt Chapter 2 
    + The main point here is the binary GA, which is summarised as Matlab code
      at the end of the chapter
    + There are many variants of each constituent step, including crossover,
      mating, and mutation

# Briefing

+ Optimisation is a general problem
    - intelligent agents optimise actions 
    - functions are optimised in all kinds of decision making
+ Two classic approaches
    - exhaustive approaches - exploring the entire solution space is rarely tractable
    - iterative improvement of a single tentative solution easily gets stuck
      in a local maximum
+ Population based methods
    - multiple tentative solutions scattered over the search space
    - reduces the risk of getting stuck in a local optimum,
      while avoiding exhaustive search
    - many variants
+ Most basic variant - look up R&N Sections 4.1-4.2
+ Genetic algorithms
    - overview of the demo implementation
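
Before reading the demo code, it may help to see the whole generation loop in
miniature. The sketch below is generic, not the pygax implementation;
truncation mating, single-point crossover, and per-bit mutation are just one
common choice of constituent functions:

```python
import numpy as np

rng = np.random.default_rng()

def binary_ga(cost, n_bits, pop_size=20, generations=50, p_mut=0.02):
    """Minimal binary GA: truncation mating, single-point crossover, bit-flip mutation."""
    pop = rng.integers(0, 2, size=(pop_size, n_bits))
    for _ in range(generations):
        # Sort by cost (minimisation), best individual first.
        pop = pop[np.argsort([cost(ind) for ind in pop])]
        # The better half survives and becomes the mating pool.
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_bits)  # single-point crossover
            children.append(np.concatenate([a[:cut], b[cut:]]))
        pop = np.vstack([parents, np.array(children)])
        # Flip each bit with probability p_mut (for simplicity, elites included).
        flip = rng.random(pop.shape) < p_mut
        pop = np.where(flip, 1 - pop, pop)
    return min(pop, key=cost)

# Toy problem: find a bit string whose sum is exactly 4.
best = binary_ga(lambda bits: abs(int(bits.sum()) - 4), n_bits=16)
print(best, best.sum())
```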

# Exercises

I have provided a demo implementation of GA and Local Beam Search on
[github](https://github.com/hgeorgsch/pygax).  

Your goal should be to

1.  Gain some experience with these algorithms on different problems,
    exploring the limitations of each one.
2.  Learn to adapt the GA by changing selection, crossover, and mutation.
3.  Understand how the algorithms work in general.

It is recommended to work together, possibly splitting the tests between
you so that you can compare more results.  Feel free to test other
variations than those proposed below.  Make sure you keep a log of the
tests you run, so that you can compare performance.

##  First exploration

Have a look at the code and the test scripts.  Check that they work 
on your system.  What functions are optimised in the test scripts?

Answer the following questions by testing:

1.  What is the run time of each algorithm?  Is there a lot of variation?
2.  How many iterations (generations) and how large a population are necessary
    to get accurate results? 
3.  Are the results consistent between runs?  Or do you tend to get different,
    random solutions each time?  How much does this depend on the generation count
    and population size?
4.  Which algorithm is better, GA or Local Beam?  
    Is the answer the same for both functions?

##  Basic Hill Climbing

With the population size set to one, Local Beam Search degenerates into
ordinary Hill Climbing.
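
For comparison, plain Hill Climbing on a continuous cost function fits in a
few lines. This is a generic sketch, independent of pygax; note that it only
ever accepts improving moves, which is precisely what traps it in local
optima:

```python
import numpy as np

rng = np.random.default_rng()

def hill_climb(cost, x0, step=0.1, iterations=1000):
    """Stochastic hill climbing: move to a random neighbour only if it improves."""
    x = np.asarray(x0, dtype=float)
    fx = cost(x)
    for _ in range(iterations):
        candidate = x + rng.normal(scale=step, size=x.shape)
        fc = cost(candidate)
        if fc < fx:  # accept only strict improvements
            x, fx = candidate, fc
    return x, fx

# A convex bowl: here the climber always finds the global minimum at the origin.
x, fx = hill_climb(lambda v: float(np.sum(v ** 2)), x0=[2.0, -3.0])
print(x, fx)
```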

1.  Test this Hill Climbing algorithm a couple of times on the F5 function
    (as used in `testBeam.py`).
2.  Do you get consistent results?
3.  Does the number of iterations make a difference?
4.  What is the major limitation of Hill Climbing?

## Varying the cost function

1. Make a new test script, changing the cost function.
   You can use one of the test functions from Haupt and Haupt Appendix I
   (published on BB), your own, or a standard benchmark such as the one
   sketched after this list.
2. Can you find functions where the GA performs badly?
   Where Beam Search is bad?
3. Do the two algorithms struggle on the same or different cost functions?
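
If you want a ready-made hard case, the Rastrigin function is a standard
multimodal benchmark (not taken from Haupt and Haupt, but similar in spirit
to their test functions): it surrounds the global minimum at the origin with
a regular grid of local minima:

```python
import numpy as np

def rastrigin(x):
    """Rastrigin function: a grid of local minima around the global minimum f(0) = 0."""
    x = np.asarray(x, dtype=float)
    return 10 * len(x) + float(np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x)))

print(rastrigin([0.0, 0.0]))  # 0.0 at the global minimum
print(rastrigin([1.0, 1.0]))  # 2.0 at a nearby local minimum
```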

## Variations of the GA

1.  Discuss: what alternatives are there to the default mating, crossover, and
    mutation functions?
2.  Implement and test a few such alternatives; two common ones are sketched
    after this list.
3.  Can you improve the performance on any of the test functions by using such 
    alternative constituent functions?
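
Two widely used alternatives are tournament selection (in place of truncation
or rank-based mating) and uniform crossover (in place of single-point
crossover). The sketch below is generic, assuming a NumPy array population
rather than the pygax interfaces:

```python
import numpy as np

rng = np.random.default_rng()

def tournament_select(pop, costs, k=3):
    """Tournament selection: draw k individuals at random, return the best of them."""
    idx = rng.integers(len(pop), size=k)
    return pop[idx[np.argmin(costs[idx])]]

def uniform_crossover(a, b):
    """Uniform crossover: each gene is taken from either parent with probability 1/2."""
    mask = rng.random(len(a)) < 0.5
    return np.where(mask, a, b)

pop = rng.integers(0, 2, size=(6, 8))
costs = np.array([int(ind.sum()) for ind in pop])  # toy cost: number of ones
p1 = tournament_select(pop, costs)
p2 = tournament_select(pop, costs)
print(uniform_crossover(p1, p2))
```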


# Debrief

Reflection.

- What worked well?
- What worked badly?
- What have you learnt?
- What more do you need to learn?