I spent way too long figuring this one out, so here's what I've got:
An improvement on [1], which I vaguely remember using with pen and paper to find minima of differentiable functions. The original algorithm runs "on a loop" (iteratively) and uses the first and second derivatives of a function (f', f''): finding a minimum of f amounts to finding a root of f'. From the article:
> Newton did it for degree 2. He did that because nobody knew how to minimize higher-order polynomials
The improved version looks a lot more complex: it seems to sacrifice simplicity for faster convergence to the minimum when implemented as a program (rough sketch of the classic version below).
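For reference, here's a minimal sketch of the classic iteration x <- x - f'(x)/f''(x). The toy objective f(x) = x^4 - 3x^2 + 2 and its hand-derived derivatives are my own example, not from the article:

    #include <math.h>
    #include <stdio.h>

    static double fp(double x)  { return 4*x*x*x - 6*x; }  /* f'(x)  */
    static double fpp(double x) { return 12*x*x - 6; }     /* f''(x) */

    int main(void) {
        double x = 2.0;                    /* starting guess */
        for (int i = 0; i < 20; i++) {
            double step = fp(x) / fpp(x);  /* Newton step: f'/f'' */
            x -= step;
            if (fabs(step) < 1e-12)        /* step small enough: converged */
                break;
        }
        printf("minimum near x = %f\n", x); /* expect sqrt(3/2) ~ 1.2247 */
        return 0;
    }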
Trivia: a "single loop" (one iteration) of Newton's method is famously used in Quake's Fast InvSqrt() implementation [2].
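The refinement step looks roughly like this (a sketch based on the description in [2], not the verbatim Quake source; memcpy replaces the original's pointer cast, and the magic constant is the one quoted there):

    #include <stdint.h>
    #include <string.h>

    float fast_inv_sqrt(float x) {
        float y = x;
        uint32_t i;
        memcpy(&i, &y, sizeof i);           /* reinterpret float bits as int */
        i = 0x5f3759df - (i >> 1);          /* crude initial guess for 1/sqrt(x) */
        memcpy(&y, &i, sizeof y);
        y = y * (1.5f - 0.5f * x * y * y);  /* one Newton iteration on g(y) = 1/y^2 - x */
        return y;
    }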
I don't think they're quite synonyms. In math they denote two different things: the reciprocal of f(x) = y is g(x) = 1/y, while the inverse is g(y) = x.
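For example, with f(x) = 2x the reciprocal is 1/f(x) = 1/(2x), but the inverse is f^-1(y) = y/2. What InvSqrt() computes is 1/sqrt(x), i.e. the reciprocal of the square root.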
--
1: https://en.wikipedia.org/wiki/Newton's_method
2: https://en.wikipedia.org/wiki/Fast_inverse_square_root#Newto...