
> there are no important applications for those series.

I cannot believe I just read this.



When have you ever used the Taylor series of sine and cosine for anything (outside school)?

When you approximate functions by polynomials, including the trigonometric functions, the Taylor series are never used, because they are inefficient (too much computation for a given error). Other kinds of polynomials are used for function approximation.

The Taylor series are a tool used in some symbolic computations, e.g. symbolic differentiation or integration, but even there it is extremely unlikely that the Taylor series of the trigonometric functions would ever be used. What may be used are the derivative formulas for the trigonometric functions, in order to expand an input function into its Taylor series.

The Taylor series of arbitrary functions (more precisely, their first few terms) may be used in the design of various numeric algorithms, but here, too, there is no need for the Taylor series of specific functions such as the trigonometric functions.

The Taylor series obviously have uses, but the specific Taylor series for the trigonometric functions do not have practical applications, even if they are interesting in mathematical theory.
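For a concrete sense of the gap being claimed here, a rough sketch (hypothetical, not taken from any library) can compare the degree-5 Taylor polynomial of sin against a degree-5 interpolant through Chebyshev nodes on [−π/4, π/4] — not even a true minimax fit, yet already far more accurate at the interval edge:

```python
import math

def cheb_nodes(n, a, b):
    # Chebyshev points of the first kind, mapped from [-1, 1] to [a, b]
    return [(a + b) / 2 + (b - a) / 2 * math.cos((2 * k + 1) * math.pi / (2 * n))
            for k in range(n)]

def lagrange_eval(xs, ys, x):
    # Direct Lagrange interpolation; fine for a handful of nodes
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        w = 1.0
        for j, xj in enumerate(xs):
            if j != i:
                w *= (x - xj) / (xi - xj)
        total += yi * w
    return total

def taylor5(x):
    # Degree-5 Taylor polynomial of sin about 0
    return x - x**3 / 6 + x**5 / 120

a, b = -math.pi / 4, math.pi / 4
xs = cheb_nodes(6, a, b)               # 6 nodes -> degree-5 interpolant
ys = [math.sin(x) for x in xs]

grid = [a + (b - a) * t / 1000 for t in range(1001)]
err_taylor = max(abs(taylor5(x) - math.sin(x)) for x in grid)
err_cheb = max(abs(lagrange_eval(xs, ys, x) - math.sin(x)) for x in grid)
print(err_taylor, err_cheb)  # Taylor ~3.6e-5 at the edge; Chebyshev well below 1e-5
```

The Taylor polynomial concentrates its accuracy at 0 and pays for it at ±π/4; the interpolant spreads the error over the whole interval, which is the behavior a production approximation is tuned for.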


> When you approximate functions by polynomials, including the trigonometric functions, the Taylor series are never used, because they are inefficient (too much computation for a given error). Other kinds of polynomials are used for function approximations.

Can you point me to some implementation of sin that’s not actually using Taylor expansion in some form? Because most that I am aware of do in fact use Taylor series (the others are just table lookups). See glibc for example:

https://github.com/bminor/glibc/blob/release/2.34/master/sys...

And here is musl

https://git.musl-libc.org/cgit/musl/tree/src/math/__sin.c

(The constants are easily checked to be -1/3!, 1/5!, etc.)

This might have something to do with Taylor's theorem. You know, that the Taylor polynomial of order n is the only polynomial of order n that satisfies |f(x)-T(x)|/(x-a)^n -> 0 as x -> a. In other words, the Taylor polynomial of order n is the unique polynomial approximation to f around a that is accurate to order n. This means you cannot do better than Taylor close to the origin of the expansion, which is why implementers focus on argument reduction rather than on selecting different polynomials.
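For reference, the Taylor coefficients referred to above can be generated directly and eyeballed against the constants in the linked sources (the library constants themselves are not reproduced here):

```python
from math import factorial

# Coefficients of the Taylor series sin(x) = x + S1*x^3 + S2*x^5 + ...
# S1 = -1/3!, S2 = +1/5!, S3 = -1/7!, etc.
coeffs = {f"S{k}": (-1) ** k / factorial(2 * k + 1) for k in range(1, 7)}
for name, c in coeffs.items():
    print(f"{name} = {c:+.20e}")
```

Comparing these printed values against the S1..S6 constants in glibc's or musl's source is then a quick visual diff.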


If any of those libraries used the Taylor expansion for the approximation, that would be a big mistake, because the approximation error becomes large at the upper end of the argument interval, even if it is small close to zero.

What is much more likely is that if you carefully compare the polynomial coefficients with those of the Taylor series, you will see that the last decimals differ, and that the difference from the Taylor series increases toward the coefficients of the higher-degree terms.

Toward zero, any approximation polynomial begins to resemble the Taylor series in its low-degree terms, because the high-degree terms become negligible and the limits of the Taylor series and of the approximation polynomial are the same.

So the polynomial coefficients should resemble those of the Taylor series in the low-degree terms, but an accurate comparison of the coefficients should show that the polynomials are different.
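A quick experiment along these lines (a sketch using an ordinary least-squares fit as a stand-in for a proper minimax polynomial; none of these names come from any library) shows exactly this pattern — the low-degree coefficients agree with Taylor's to many digits, while the top coefficient deviates far more in relative terms:

```python
import numpy as np

# Fit an odd-looking degree-9 polynomial to sin on [-pi/4, pi/4] by least
# squares, then compare its coefficients with the Taylor coefficients.
a = np.pi / 4
xs = np.linspace(-a, a, 2001)
c = np.polyfit(xs, np.sin(xs), 9)      # highest degree first

fit = {1: c[-2], 3: c[-4], 5: c[-6], 7: c[-8], 9: c[-10]}
taylor = {1: 1.0, 3: -1 / 6, 5: 1 / 120, 7: -1 / 5040, 9: 1 / 362880}

for d in (1, 3, 5, 7, 9):
    rel = abs(fit[d] - taylor[d]) / abs(taylor[d])
    print(f"x^{d}: fitted {fit[d]:+.12e}  Taylor {taylor[d]:+.12e}  rel. diff {rel:.1e}")
```

The fitted polynomial trades a little accuracy near zero (where the Taylor terms are exact) for a much smaller error at the interval edge, and that trade shows up almost entirely in the high-degree coefficients.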


I did some testing, and you are correct that they are slightly different:

  >>> print("{:.30f}".format(sin(pi/4, 0, 0, *standard_coeff)))
  0.707106781186567889818661569734
  >>> print("{:.30f}".format(sin(pi/4, 0, 0, *musl_coeff)))
  0.707106781186547461715008466854
The difference at the very edge of the interval shows up at the 14th decimal digit, while the musl result is off from the exact value only at the edge of double precision, around the 16th digit: after ...6547, the exact value continues with 5244, whereas the musl polynomial gives 4617. I wouldn't exactly call it a big mistake, as the difference would be irrelevant in almost all practical uses, but it would be a mistake nevertheless, and I'm sure someone would eventually be bitten by it. Thanks, I learned something new today!
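The size of that edge-of-interval error is exactly what the first omitted Taylor term predicts: for a degree-13 polynomial, the truncation error at x = π/4 is about (π/4)^15/15! ≈ 2e-14, i.e. the 14th decimal digit. A quick sanity check (a standalone sketch, not the tester's actual script):

```python
import math

# Truncation error of the degree-13 Taylor polynomial of sin at x = pi/4,
# the edge of the reduced interval.
x = math.pi / 4
terms = [(-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1) for k in range(7)]
approx = sum(terms)
err = abs(approx - math.sin(x))
print(err)  # on the order of 1e-14
```

This matches the 14th-digit discrepancy observed above for the pure Taylor coefficients.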


> When have you ever used the Taylor series of sine and cosine for anything (outside school) ?

I've used them a few times, mostly in the embedded space, and mostly in conjunction with lookup tables and/or Newton's method, but yes I've absolutely used them outside school (years ago, I forget the exact details).

- implementing my own trig functions for embedded applications where I wanted fine control over the computation-vs-precision tradeoff

- implementing my own functions for hypercomplex numbers (quaternions, duals, dual quaternions, and friends).

- automatic differentiation

Does the Taylor series form survive into the final application? Usually not; it usually gets optimized into something else. But "start with the Taylor series and get back to basics for a slow but accurate function" has gotten me out of several pickles, and the final form usually still contains chunks of the Taylor series.
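A minimal version of that kind of "slow but accurate" development-time reference might look like this (a sketch, not any particular production implementation; summing via a term recurrence avoids computing factorials explicitly):

```python
import math

def sin_taylor(x, tol=1e-15):
    # Reference implementation: sum the Taylor series of sin term by term.
    # Each term is derived from the previous one:
    #   t_{k} = -t_{k-1} * x^2 / ((2k)(2k+1))
    # Slow but straightforward; assumes x is already reduced to a
    # moderate range (the series converges everywhere, just slowly).
    term = x
    total = x
    k = 1
    while abs(term) > tol:
        term *= -x * x / ((2 * k) * (2 * k + 1))
        total += term
        k += 1
    return total

print(sin_taylor(0.5), math.sin(0.5))
```

Once this agrees with the platform libm, the fast version (table, minimax polynomial, whatever) can be validated against it.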


I agree that using the Taylor series can be easier, especially during development, mainly because convenient tools for generating approximation polynomials or other kinds of approximating functions are not widespread.

However, the performance with Taylor series is guaranteed to be worse than with approximation polynomials that are optimal according to an appropriate criterion (e.g. minimax error).

Still, I cannot see when you would want the Taylor series of the trigonometric functions specifically, even if for less common functions it can be handy.

There are plenty of open-source libraries with good approximations of the trigonometric functions, so there is no need to develop one's own.

In the case of a very weak embedded CPU there is the alternative of using CORDIC for the trigonometric functions, instead of polynomial approximations. CORDIC can be very accurate, even if on CPUs with fast multipliers it is slower than polynomial approximation.
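For illustration, here is a floating-point sketch of circular CORDIC in rotation mode (a real embedded version would use fixed-point shifts and adds instead of multiplications by 2^-i, but the structure is the same; the 32-iteration count and the input-range assumption are illustrative):

```python
import math

N = 32
ANGLES = [math.atan(2.0 ** -i) for i in range(N)]

# Gain of the rotation sequence; compensated once in the starting vector
K = 1.0
for i in range(N):
    K /= math.sqrt(1.0 + 2.0 ** (-2 * i))

def cordic_sin_cos(theta):
    # Rotate the vector (K, 0) toward angle theta by fixed micro-rotations.
    # Assumes theta is already reduced to roughly [-pi/2, pi/2].
    x, y, z = K, 0.0, theta
    for i in range(N):
        d = 1.0 if z >= 0.0 else -1.0
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * ANGLES[i]
    return y, x  # (sin(theta), cos(theta))

print(cordic_sin_cos(0.6))
```

Each iteration uses only a shift-sized scaling and an add per coordinate, which is why the scheme suits multiplier-less hardware; after 32 iterations the residual angle is below atan(2^-31), i.e. well under 1e-9.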



