Apply the four optimization algorithms *gradient descent*, *genetic algorithms*, *simulated annealing*, and *particle swarm optimization* to minimize each of the following four 2-dimensional functions:

1) Rastrigin
2) Rosenbrock
3) Ackley
4) Chasm
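
The definitions of the benchmark functions did not survive in this copy. For reference, the textbook 2-D forms of the first three benchmarks can be sketched as below; note that the variants used in the provided MATLAB code may be scaled or shifted differently, and the "Chasm" function is course-specific, so it is not reproduced here.

```python
import numpy as np

def rastrigin(x, y):
    # Standard 2-D Rastrigin; highly multimodal, global minimum f(0, 0) = 0
    return 20 + x**2 - 10*np.cos(2*np.pi*x) + y**2 - 10*np.cos(2*np.pi*y)

def rosenbrock(x, y):
    # Standard 2-D Rosenbrock; narrow curved valley, global minimum f(1, 1) = 0
    return (1 - x)**2 + 100*(y - x**2)**2

def ackley(x, y):
    # Standard 2-D Ackley; nearly flat outer region, global minimum f(0, 0) = 0
    return (-20*np.exp(-0.2*np.sqrt(0.5*(x**2 + y**2)))
            - np.exp(0.5*(np.cos(2*np.pi*x) + np.cos(2*np.pi*y)))
            + 20 + np.e)
```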

- a) Download the MATLAB code for the genetic algorithm toolbox (`gatoolbox`), simulated annealing (`anneal.m`), and the calculation of the gradient (`gradient.m`).^{1}
- b) Implement the particle swarm optimization algorithm by completing the MATLAB function `pso.m`.
- c) Implement the gradient descent algorithm with an adaptive learning rate, estimating the gradient with finite differences as provided in the MATLAB function `gradient.m`.
- d) To carry out the optimization you can write your own MATLAB code or use the code template provided in `compare.m`. Missing code fragments that have to be completed are marked with "`... HOMEWORK ...`".
- e) For each of the four optimization problems, set the free parameters of each of the four optimization algorithms to appropriate values. The free parameters are marked with "`... HOMEWORK ...`" in the MATLAB code. Explain and justify your choices.
- f) Repeat the minimization of each of the four functions with each of the four algorithms 10 times (with the parameter settings chosen in e)) and calculate the mean and the standard error of the mean (SEM) of the resulting minimum function values and of the corresponding run times.
- g) Compare the results obtained in f) and interpret possible strengths and weaknesses of each optimization algorithm with respect to the optimized functions. Hand in graphical illustrations of your results that support all of your statements.
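
For orientation on part b), a minimal particle swarm optimizer can be sketched as follows. This is a language-neutral Python/NumPy illustration, not the expected contents of `pso.m`; the parameter names and default values (inertia `w`, cognitive coefficient `c1`, social coefficient `c2`, swarm size, iteration count) are illustrative assumptions and are exactly the kind of free parameters part e) asks you to tune.

```python
import numpy as np

def pso(f, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize a 2-D function f(x, y) with a basic particle swarm.
    w: inertia weight, c1/c2: cognitive/social coefficients (illustrative defaults)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, size=(n_particles, 2))   # random initial positions
    vel = np.zeros_like(pos)
    pbest = pos.copy()                                 # per-particle best positions
    pbest_val = f(pos[:, 0], pos[:, 1])
    gbest = pbest[np.argmin(pbest_val)].copy()         # swarm-wide best position
    gbest_val = pbest_val.min()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, 1))       # fresh random factors
        vel = w*vel + c1*r1*(pbest - pos) + c2*r2*(gbest - pos)
        pos = np.clip(pos + vel, lo, hi)               # keep particles in bounds
        val = f(pos[:, 0], pos[:, 1])
        improved = val < pbest_val                     # update personal bests
        pbest[improved] = pos[improved]
        pbest_val[improved] = val[improved]
        if val.min() < gbest_val:                      # update global best
            gbest_val = val.min()
            gbest = pos[np.argmin(val)].copy()
    return gbest, gbest_val
```

For example, `pso(lambda x, y: x**2 + y**2, (-5, 5))` should return a point close to the origin with a function value near zero.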
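
For part c), one common way to make the learning rate adaptive is to grow the step size after a successful (function-decreasing) step and shrink it after a failed one, with the gradient estimated by central finite differences. The Python sketch below illustrates this scheme; the growth/shrink factors and the step-size handling are illustrative assumptions, not the specific rule expected by the course's `gradient.m`.

```python
import numpy as np

def fd_gradient(f, p, h=1e-6):
    # Central finite-difference estimate of the gradient of f at point p
    g = np.zeros_like(p, dtype=float)
    for i in range(len(p)):
        e = np.zeros_like(p, dtype=float)
        e[i] = h
        g[i] = (f(p + e) - f(p - e)) / (2*h)
    return g

def gradient_descent(f, p0, eta=0.1, n_iter=500, grow=1.1, shrink=0.5):
    """Gradient descent with a simple adaptive learning rate:
    accept a step and grow eta if f decreases, otherwise shrink eta."""
    p = np.asarray(p0, dtype=float)
    fp = f(p)
    for _ in range(n_iter):
        g = fd_gradient(f, p)
        p_new = p - eta * g
        f_new = f(p_new)
        if f_new < fp:        # successful step: accept and be more aggressive
            p, fp = p_new, f_new
            eta *= grow
        else:                 # failed step: reject and be more cautious
            eta *= shrink
    return p, fp
```

On a simple convex quadratic this converges quickly; on Rosenbrock's curved valley the same scheme is much slower, which is one of the contrasts part g) asks you to discuss.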
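
For the statistics in part f), recall that the SEM is the sample standard deviation divided by the square root of the number of repetitions. A small Python sketch (using the unbiased estimator with `ddof=1`; in MATLAB, `std` uses this normalization by default):

```python
import numpy as np

def mean_sem(values):
    """Mean and standard error of the mean of a list of repeated measurements.
    SEM = sample std (unbiased, ddof=1) / sqrt(n)."""
    v = np.asarray(values, dtype=float)
    return v.mean(), v.std(ddof=1) / np.sqrt(len(v))
```

Applying this to the 10 minimum function values (and, separately, the 10 run times) per algorithm/function pair gives the numbers to tabulate and plot with error bars.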