The reason linear filters don't work for upscaling, say Lanczos or sinc, is that the basic assumption made in deriving them is that the source is "bandlimited" -- that is, every image is a set of samples of some underlying signal that varies at most once per pixel. The problem is that while this works very well for regions that are supposed to be smooth -- textures and the like -- it fails at edges, since a hard edge has infinite bandwidth. The result is the wash-out and ringing you can see with linear/bicubic interpolation.
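
To make that concrete, here's a minimal 1-D sketch in plain NumPy (the helpers `lanczos` and `upscale_1d` are made-up names for illustration, not any library's API): upscaling a hard step with a Lanczos-3 kernel over/undershoots past [0, 1], which is exactly the ringing described above.

    import numpy as np

    def lanczos(x, a=3):
        # Lanczos-a kernel: sinc(x) * sinc(x/a) inside |x| < a, zero outside.
        return np.where(np.abs(x) < a, np.sinc(x) * np.sinc(x / a), 0.0)

    def upscale_1d(signal, factor, a=3):
        n = len(signal)
        t = np.arange(n * factor) / factor    # output positions in input coords
        w = lanczos(t[:, None] - np.arange(n)[None, :], a)
        w /= w.sum(axis=1, keepdims=True)     # renormalize truncated tails
        return w @ signal

    edge = np.concatenate([np.zeros(16), np.ones(16)])  # a hard step: not bandlimited
    up = upscale_1d(edge, 4)
    print(up.min(), up.max())                 # < 0 and > 1: ringing at the edge

(At exact sample positions a windowed sinc interpolates perfectly; the overshoot lives between samples, right where the new pixels go.)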

Of course, it does work almost perfectly for downscaling, because in that case we're not creating any new edges, just reconstructing the original ones -- and those vary at most once per pixel.

Actually, the effect happens just as much during downscaling. It's just less noticeable because the artifacts occur at subpixel scale -- but they do affect the output colors.

The math does not really know the difference between upscaling and downscaling.
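
A rough illustration of the same point, reusing the hypothetical helper from the sketch above: downscaling a step edge with a Lanczos-3 kernel stretched by the scale factor (the usual trick, so the kernel low-passes while it resamples) still yields values outside [0, 1], just confined to the pixels right next to the edge.

    import numpy as np

    def lanczos(x, a=3):
        return np.where(np.abs(x) < a, np.sinc(x) * np.sinc(x / a), 0.0)

    def downscale_1d(signal, factor, a=3):
        n = len(signal)
        # Output pixel centers mapped into input coordinates.
        t = (np.arange(n // factor) + 0.5) * factor - 0.5
        # Stretch the kernel by `factor` so it doubles as the low-pass.
        w = lanczos((t[:, None] - np.arange(n)[None, :]) / factor, a)
        w /= w.sum(axis=1, keepdims=True)
        return w @ signal

    edge = np.concatenate([np.zeros(64), np.ones(64)])
    down = downscale_1d(edge, 4)
    print(down.min(), down.max())   # still < 0 and > 1, just at subpixel scale

Same Gibbs-style ringing either way; downscaling just keeps it within a pixel or two of the edge instead of spreading it across many new pixels.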
