# Optimization Algorithms
## Steepest Descent
1- Choose a starting solution x^t, set the iteration counter t to zero, and specify a tolerance value ε.
2- At the point x^t, calculate the gradient g^t and its norm ||g^t||. If ||g^t|| <= ε, stop; otherwise continue.
3- Set the search direction d^t = -g^t.
4- Find the step size a^t that minimizes f(x^t + a^t*d^t) (a line search along d^t).
5- Calculate the new solution point: x^(t+1) = x^t + a^t*d^t.
6- Increase the iteration counter by 1 and return to step 2.
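
The six steps above can be sketched in Python. This is a minimal illustration, not part of the original repository: the exact line search of step 4 is replaced by a backtracking (Armijo) line search, and the test function is an assumed quadratic chosen only for demonstration.

```python
import numpy as np

def steepest_descent(f, grad, x0, eps=1e-6, max_iter=1000):
    """Steepest descent following steps 1-6 above (backtracking line search)."""
    x = np.asarray(x0, dtype=float)        # step 1: starting solution, t = 0
    for t in range(max_iter):              # step 6: t increases each pass
        g = grad(x)                        # step 2: gradient g^t at x^t
        if np.linalg.norm(g) <= eps:       # step 2: stop when ||g^t|| <= eps
            break
        d = -g                             # step 3: search direction d^t = -g^t
        a = 1.0                            # step 4: shrink a^t until the Armijo
        while f(x + a * d) > f(x) - 0.5 * a * np.dot(g, g):  # condition holds
            a *= 0.5
        x = x + a * d                      # step 5: x^(t+1) = x^t + a^t*d^t
    return x

# Example (assumed test problem): minimize f(x, y) = (x-1)^2 + 2(y+2)^2,
# whose minimum is at (1, -2).
f = lambda x: (x[0] - 1)**2 + 2 * (x[1] + 2)**2
grad = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] + 2)])
print(steepest_descent(f, grad, [0.0, 0.0]))  # ≈ [1, -2]
```

The backtracking loop is a practical stand-in for the exact minimization in step 4: it halves a^t until the trial point gives a sufficient decrease in f, which is usually much cheaper than solving the one-dimensional minimization exactly.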