emmanueloga_ 6 days ago

I spent way too long figuring this one out, so this is what I got:

An improvement on [1], which I vaguely remember using with pen and paper to find minima of differentiable functions. The original algorithm runs "on a loop" (iteratively) and uses the first and second derivatives of a function (f', f''). From the article:

> Newton did it for degree 2. He did that because nobody knew how to minimize higher-order polynomials

The improved version looks a lot more complex; it seems to sacrifice simplicity in exchange for faster convergence to the minimum when implemented as a program.

Trivia: a single iteration of Newton's method is famously used in Quake's Fast InvSqrt() implementation [2].
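The well-known Quake III snippet, lightly adapted here (the original's pointer-cast bit trick is replaced with memcpy to avoid undefined behavior in modern C):

```c
#include <stdint.h>
#include <string.h>

/* Quake III's fast reciprocal square root: a bit-level trick produces
   a rough initial guess, then one Newton iteration for f(y) = 1/y^2 - x
   refines it to within about 0.2% relative error. */
float Q_rsqrt(float number) {
    uint32_t i;
    float y = number;
    float x2 = number * 0.5f;

    memcpy(&i, &y, sizeof i);    /* reinterpret the float's bits as an integer */
    i = 0x5f3759df - (i >> 1);   /* magic-constant initial estimate */
    memcpy(&y, &i, sizeof y);
    y = y * (1.5f - x2 * y * y); /* the single Newton iteration */
    return y;
}
```

The last line is the "single loop": Newton's root-finding update y ← y(1.5 − 0.5·x·y²), run exactly once.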

--

1: https://en.wikipedia.org/wiki/Newton's_method

2: https://en.wikipedia.org/wiki/Fast_inverse_square_root#Newto...

Sniffnoy 6 days ago

There's actually a separate Wikipedia article about the variant/application of Newton's method being improved upon here: https://en.wikipedia.org/wiki/Newton%27s_method_in_optimizat... Note the italic text at the top of the "Newton's method" article!

thrdbndndn 5 days ago

This article is really badly written, IMHO. I feel the content should just be merged back into the main article.

emmanueloga_ 5 days ago

Awesome thank you for the clarification!

nyeah 5 days ago

I wish we could call the Quake algorithm "fast reciprocal square root". But it's far too late now.

lupire 5 days ago

"reciprocal" and "inverse" (and "opposite") are synonyms, but they have taken on different connotations in math.