
I spent way too long figuring this one out, so here's what I got:

It's an improvement on [1], which I vaguely remember using with pen and paper to find minima of differentiable functions. The original algorithm runs in a loop (iteratively) and uses the first- and second-order derivatives of the function (f', f''). From the article:

> Newton did it for degree 2. He did that because nobody knew how to minimize higher-order polynomials

The improved version looks a lot more complex; it seems to sacrifice simplicity in order to converge to the minimum faster when implemented as a program.

Trivia: a single iteration of Newton's method is famously used in Quake's Fast InvSqrt() implementation [2].

--

1: https://en.wikipedia.org/wiki/Newton's_method

2: https://en.wikipedia.org/wiki/Fast_inverse_square_root#Newto...




There's actually a separate Wikipedia article about the variant/application of Newton's method being improved upon here: https://en.wikipedia.org/wiki/Newton%27s_method_in_optimizat... Note the italic text at the top of the "Newton's method" article!


IMHO that article is really badly written, and I feel the content should just be merged back into the main article.


Awesome thank you for the clarification!


I wish we could call the Quake algorithm "fast reciprocal square root". But it's far too late now.


"reciprocal" and "inverse" (and "opposite") are synonyms, but have taken on a different connotations in math.


I don't think they're quite synonyms; in math they denote two different things. The reciprocal of f(x) = y is g(x) = 1/y, while the inverse of f(x) = y is g(y) = x.

https://www.thesaurus.com/browse/reciprocal

https://www.thesaurus.com/browse/inverse



