Hacker News

I think they have been trying for a long time, but while it's possible to conceive of the idea in a second, it is much harder to find the right algorithms to do it effectively. After all, they make changes every day and study the response by randomly distributing new versions of the algorithm to a small group of users, who are effectively beta testers without even knowing it.

It is a known problem, that's true, but it can be tricky. For example, it is massively clear for pages stuffed with keywords like "iphone" and "jailbreak" that carry nothing but ads, or that just want to sell you something without any real content. But if you search for, say, discussion of a scientific paper on blogs, you don't want to penalize the page of a good blogger who runs a few ads in favor of some crackpot with their own pet theory of the universe, who knows nothing about real science, just because the crackpot's pages happen to have no ads.
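The rollout scheme described above (new algorithm versions randomly shipped to a small slice of users) is usually done with deterministic hash-based bucketing. A minimal sketch, with hypothetical names and percentages, not Google's actual system:

```python
import hashlib

def assign_bucket(user_id: str, experiment: str, rollout_pct: float) -> bool:
    """Deterministically place a user in an experiment's treatment group.

    Hashing (experiment name + user id) gives a stable, roughly uniform
    value, so each experiment draws an independent random slice of users
    without storing any per-user state.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    # Map the first 8 hex digits to a fraction in [0, 1).
    fraction = int(digest[:8], 16) / 0x100000000
    return fraction < rollout_pct

# Example: roll a new ranking algorithm out to ~1% of users.
in_test = assign_bucket("user-12345", "ranking-v2", 0.01)
```

Because the assignment is a pure function of the id and the experiment name, the same user always sees the same variant across sessions, which is what makes the users unwitting but consistent beta testers.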


Well, I think Google also has a motivation to make the search results as bad as they can get away with.

A crappy page with nothing but ads is still going to get clicks, and since Google is the #1 ad network, that means more money for Google.

If Google's organic search results were perfect, people would never click on the ads. The worse the results are, the more likely the ads are better, and you get trained to click... and ker-ching!

Google needs to be good enough to (i) discourage mass defection to Bing and DuckDuckGo, (ii) avoid public outrage, as happened when A-list bloggers were getting outranked by duplicate content, and (iii) stay out of antitrust trouble. (Hint: if you want to run a spam farm, buy a second- or third-tier search engine.)



