
Your approach is going to have a lot of problems.

One of the most linked-to pages on the internet is the download page for Adobe Reader. It is definitely not spam, but millions of those links aren't going to have "the keyword" on the page, so by your logic they are bad links. This is an extreme example, but it is not an uncommon scenario.

Furthermore, if you have millions of backlinks, it becomes quite difficult to scrape Google (but you can use services like Authority Labs).
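To make that concrete, here is a minimal sketch of the kind of keyword-on-page test being proposed (the function name is hypothetical, and it assumes the Python requests library):

    import requests

    def link_has_keyword(linking_page_url: str, keyword: str) -> bool:
        # Naive test: fetch the page that links to you and see whether
        # "the keyword" appears anywhere in its HTML.
        try:
            resp = requests.get(linking_page_url, timeout=10)
            resp.raise_for_status()
        except requests.RequestException:
            return False  # unreachable pages get flagged as "bad" too
        return keyword.lower() in resp.text.lower()

    # A page saying only "Get Adobe Reader here" links legitimately to the
    # download page, yet fails this test for almost any keyword you pick.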




Why are you doing Bad Neighbor link checks if you make something like Adobe Reader?

You don't have to scrape Google; they have a Search API that costs about $10 per 1,000 calls at volume.

I have done this for BILLIONS of links.


You think Adobe never built any bad links? I know many large companies that are spending hundreds of thousands of dollars a year or more buying links.

Do you have a link to the API, please? Thanks!


Custom Search. Don't specify any rules that would interact with the results you are testing against (e.g., a rule that favors a parked domain).
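Presumably this means the Custom Search JSON API (https://developers.google.com/custom-search/v1/overview). A minimal sketch of calling it; the key, engine ID, and query below are placeholders you supply, and the engine must be configured to search the entire web for site: queries to work:

    import requests

    API_KEY = "YOUR_API_KEY"   # from the Google Cloud console
    CX = "YOUR_ENGINE_ID"      # Programmable Search Engine ID; leave its
                               # rules empty so they don't skew the results

    def google_results(query: str) -> list[str]:
        # Query the Custom Search JSON API and return the result URLs.
        resp = requests.get(
            "https://www.googleapis.com/customsearch/v1",
            params={"key": API_KEY, "cx": CX, "q": query, "num": 10},
            timeout=10,
        )
        resp.raise_for_status()
        return [item["link"] for item in resp.json().get("items", [])]

    # e.g., check whether a linking page is still in the index:
    # still_indexed = bool(google_results("site:example.com/links-page"))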

If they are buying those links, then finding the bad ones is as easy as contacting the people they cut checks to.



