
> hide sites by domain

This gives me the idea to build a search engine that only contains content from domains that have been vouched for. Basically, you'd have an upvote/downvote system for the domains, perhaps with some walls to make sure only trusted users can upvote/downvote. It seems like in practice, many people do this anyway. This could be the best of both worlds between directories and search engines.
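
A minimal sketch of how a vouch-gated filter like that could work; the domains, vote counts, and threshold here are made up purely for illustration:

    from urllib.parse import urlparse

    # Hypothetical net vote tallies per domain; in practice these would come
    # from trusted users' upvotes and downvotes.
    domain_votes = {
        "stackoverflow.com": 120,
        "pinterest.com": -45,
        "realpython.com": 60,
    }

    def is_vouched(url, min_score=10):
        """Keep only results whose domain clears the vouch threshold."""
        domain = urlparse(url).netloc.removeprefix("www.")
        return domain_votes.get(domain, 0) >= min_score

    results = [
        "https://stackoverflow.com/questions/231767",
        "https://www.pinterest.com/pin/12345",
        "https://realpython.com/python-sockets/",
    ]
    print([r for r in results if is_vouched(r)])  # Pinterest is filtered out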



I don't think this would change much; you would probably push big sites (Pinterest, Facebook) much higher in the rankings, since the 99% of users who aren't programmers would vouch for them.

You could counter that somewhat by having a "people who liked X also like Y" mechanism, but that quickly brings you back to search bubbles.

In that sense Google probably could and should do a better job of profiling you: if you never click through to a page, lower it in your rankings. Same with preferences: if I mainly use a specific programming language and search for "how to do X", they could show me results only for that language (a rough sketch of this is below).

In the end that will probably make my search results worse, as I don't use only one language ... and sometimes I actually click on Pinterest :-(
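
A rough sketch of that kind of per-user biasing, with a completely made-up profile and scoring; none of this reflects what Google actually does:

    from urllib.parse import urlparse

    # Hypothetical per-user signals; a real engine would infer these from history.
    user_profile = {
        "preferred_languages": {"python"},
        "never_clicked_domains": {"pinterest.com"},
    }

    def personalize(results):
        """Re-score (url, base_score, language_tags) tuples with the profile."""
        rescored = []
        for url, score, tags in results:
            domain = urlparse(url).netloc.removeprefix("www.")
            if domain in user_profile["never_clicked_domains"]:
                score *= 0.3   # the user never clicks through here: push it down
            if tags & user_profile["preferred_languages"]:
                score *= 1.5   # matches the language they mostly search for
            rescored.append((url, score))
        return sorted(rescored, key=lambda pair: pair[1], reverse=True)

    results = [
        ("https://www.pinterest.com/pin/999", 0.9, set()),
        ("https://realpython.com/how-to-do-x/", 0.6, {"python"}),
        ("https://example.com/how-to-do-x-in-ruby", 0.7, {"ruby"}),
    ]
    print(personalize(results))  # the Python result comes out on top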


You don't need an upvote/downvote system. If someone searches for X and clicks through results, you just record when they stop trying sites or new search terms, since you can assume the query has been answered, and reward that site. Most engines are already doing this in some form.
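
One way to sketch that "reward the last click before the user goes quiet" idea; the event log, timings, and idle threshold are invented for illustration:

    from dataclasses import dataclass

    @dataclass
    class Event:
        kind: str    # "query" or "click"
        value: str   # query text or clicked URL
        t: float     # seconds since the session started

    def satisfied_click(events, idle_gap=120.0):
        """Return the last clicked URL if no further query or click followed
        within `idle_gap` seconds, i.e. the user seems to have stopped searching."""
        clicks = [e for e in events if e.kind == "click"]
        if not clicks:
            return None
        last = clicks[-1]
        followed_up = any(0 < e.t - last.t < idle_gap for e in events)
        return None if followed_up else last.value

    session = [
        Event("query", "how to parse json python", 0.0),
        Event("click", "https://spammy-clone.example/json", 5.0),
        Event("query", "python json.loads example", 40.0),
        Event("click", "https://docs.python.org/3/library/json.html", 45.0),
    ]
    print(satisfied_click(session))  # the docs page gets the reward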


This is what Google already does, does it not? Why else would they be recording outbound clicks?

Unfortunately, this doesn't entirely solve the problem. Counting clicks doesn't work because you don't know if the result was truly worthwhile or if the user was just duped by clickbait.

As you say, noting when they stop trying sites is a better signal, but I don't know how good that signal-to-noise ratio is. A low-quality site might answer my question, sort of, perhaps in the minimal way that gets me to stop searching. But it might not answer it nearly as well as a high-quality site that would really illuminate and explain things and lead to more useful information for me to explore. Both those scenarios would look identical to the click-tracking code of the search engine.


If I click on link 1 and then click on link 2 several minutes later, 1 probably sucked. The difficulty is that if I click on 1 and then 2 quickly, it may just mean I'm opening a bunch of tabs proactively.
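
That distinction could be encoded as a crude dwell-gap heuristic; the cutoffs below are arbitrary guesses, not anything a real engine is known to use:

    def classify_click_pair(gap_seconds):
        """Guess what the gap between two result clicks says about the first one."""
        if gap_seconds < 10:   # rapid clicks: probably opening tabs proactively
            return "parallel_tabs"
        if gap_seconds > 60:   # read it, came back: the first result probably sucked
            return "first_result_unsatisfying"
        return "ambiguous"

    print(classify_click_pair(3))    # parallel_tabs
    print(classify_click_pair(300))  # first_result_unsatisfying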


Often you don’t know if a site is legit or not without first visiting it.

And new clone sites launch all the time, so I'm always clicking through to clone sites I've never seen before and can't avoid them in the results.


Yeah, when I get caught by these SEO spam sites it's because they didn't rank near the SO thread they ripped off, so it wasn't immediately apparent.


> This gives me the idea to build a search engine that only contains content from domains that have been vouched for.

Just giving us personal blocklists would help a lot.

Then, if search engines notice that most people block certain websites, they could let that affect ranking as well.
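
A minimal sketch of that two-level idea, with a hypothetical per-user blocklist plus a global demotion once enough users block a domain; the thresholds and counts are made up:

    from collections import Counter
    from urllib.parse import urlparse

    # This user's own blocklist, plus how many users block each domain overall.
    user_blocklist = {"pinterest.com"}
    global_block_counts = Counter({"spammy-clone.example": 9200, "pinterest.com": 150})

    def rerank(results, demote_threshold=5000, demote_factor=0.5):
        """Hide personally blocked domains; demote widely blocked ones for everyone."""
        kept = []
        for url, score in results:
            domain = urlparse(url).netloc.removeprefix("www.")
            if domain in user_blocklist:
                continue  # hidden entirely for this user
            if global_block_counts[domain] >= demote_threshold:
                score *= demote_factor
            kept.append((url, score))
        return sorted(kept, key=lambda pair: pair[1], reverse=True)

    print(rerank([
        ("https://stackoverflow.com/q/1", 0.8),
        ("https://www.pinterest.com/pin/2", 0.9),
        ("https://spammy-clone.example/q/1", 0.9),
    ]))
    # Pinterest is hidden for this user; the widely blocked clone drops below SO.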



