I spent way too long figuring this one out, so here's what I've got:
An improvement on Newton's method [1], which I vaguely remember using with pen and paper to find minima of differentiable functions. The original algorithm runs "on a loop" (iteratively) and uses the first- and second-order derivatives of the function (f', f''); see the sketch below the quote. From the article:
> Newton did it for degree 2. He did that because nobody knew how to minimize higher-order polynomials
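For reference, here's a minimal sketch of that classic loop (my own toy example, not code from the article). The update rule for minimization is x_{k+1} = x_k - f'(x_k)/f''(x_k), applied here to f(x) = x^4 - 3x^2 + 2:

    #include <stdio.h>
    #include <math.h>

    /* Toy objective: f(x) = x^4 - 3x^2 + 2,
       so f'(x) = 4x^3 - 6x and f''(x) = 12x^2 - 6. */
    static double fp(double x)  { return 4*x*x*x - 6*x; }
    static double fpp(double x) { return 12*x*x - 6; }

    int main(void) {
        double x = 2.0;                  /* starting guess */
        for (int k = 0; k < 20; k++) {
            double step = fp(x) / fpp(x);
            x -= step;                   /* x_{k+1} = x_k - f'(x_k)/f''(x_k) */
            if (fabs(step) < 1e-12) break;
        }
        printf("local minimum near x = %.12f\n", x);  /* ~1.224745 = sqrt(3/2) */
        return 0;
    }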
The improved version looks a lot more complex; it seems to sacrifice simplicity for faster convergence to the minimum when implemented as a program.
Trivia: a single Newton iteration is famously used in Quake III's Fast InvSqrt() implementation [2].
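For the curious, this is the well-known Quake III Arena snippet (comments are mine; the magic constant and structure are from the original source, which assumes a 32-bit long and type-puns through pointers, technically UB in modern C). The Newton step here refines y toward a root of g(y) = 1/y^2 - x, i.e. y_{n+1} = y * (1.5 - 0.5 * x * y^2):

    float Q_rsqrt(float number)
    {
        long i;
        float x2, y;
        const float threehalfs = 1.5F;

        x2 = number * 0.5F;
        y  = number;
        i  = *(long *)&y;                     /* reinterpret the float's bits as an integer */
        i  = 0x5f3759df - (i >> 1);           /* magic-constant initial guess for 1/sqrt */
        y  = *(float *)&i;
        y  = y * (threehalfs - (x2 * y * y)); /* one Newton iteration refines the guess */
        return y;
    }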
--
1: https://en.wikipedia.org/wiki/Newton's_method
2: https://en.wikipedia.org/wiki/Fast_inverse_square_root#Newto...
There's actually a separate Wikipedia article about the variant/application of Newton's method being improved upon here: https://en.wikipedia.org/wiki/Newton%27s_method_in_optimizat... Note the italic text at the top of the "Newton's method" article!
This article is really badly written, IMHO; I feel the content should just be merged back into the main article.