And of course there are lots of other issues. For example, the Web is very hostile when you are not Googlebot (i.e. there are a lot of big sites that will forbid you from crawling their content, unless you are Google, or Bing).
The money mentioned in that article is about being the default search engine, not being a functional one. Basically the question I'm asking is why DDG has to outsource to Bing.
> there are a lot of big sites that will forbid you from crawling their content, unless you are Google, or Bing
If you're talking about robots.txt, that's just a suggestion. It has no actual enforcement power; compliance is entirely up to the crawler.
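To illustrate the point: robots.txt only *declares* a policy, and a crawler is free to read it and ignore it. A minimal sketch with Python's standard `urllib.robotparser`, using a hypothetical robots.txt that allows only Googlebot:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: Googlebot may crawl everything,
# every other user agent is asked to stay out.
robots_txt = """
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(robots_txt)

# The parser only reports what the file *asks* for; nothing stops
# a non-compliant crawler from fetching the page anyway.
print(rp.can_fetch("Googlebot", "https://example.com/page"))  # True
print(rp.can_fetch("MyNewBot", "https://example.com/page"))   # False
```

Big sites that actually want to keep non-Google crawlers out typically layer on user-agent sniffing, IP verification, and rate limiting, since the file itself prevents nothing.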