
I frequently wonder - is Facebook allowed to say, "bing, you can crawl us. NewCompetitor, you cannot."

I feel like once a company allows public access by posting stuff on the web, it can specify terms, but not include/exclude specific groups. (In a legal sense; I understand blocking systems that hammer servers, though in my experience they will respect robots.txt. IME Bing is the worst offender -- it hammers my sites and sends no traffic, but it will stop if I tell it to in robots.txt.)
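For reference, telling one specific crawler to stay away while leaving everyone else alone looks like this in robots.txt (bingbot is Bing's documented user-agent token; note this is purely advisory -- compliant crawlers honor it, but nothing enforces it):

```
# Ask Bing's crawler to skip the whole site
User-agent: bingbot
Disallow: /

# All other crawlers: no restrictions
User-agent: *
Disallow:
```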

Does anyone have an opinion about "once public, I can crawl"?



I can think of no reason why there would be any such restriction.

Suppose Facebook is getting paid by Bing, and won't offer crawling to those that aren't paying it. Suppose Facebook considers Baidu's crawler to be evil and chooses to prohibit it for that reason. Suppose Facebook just kind of likes the folks at Bing and decides to allow them special access. If you agree in the first place that Facebook should have the right to put ANY sort of restrictions on who can crawl their site, then why should ANY of these be prohibited? This is not a "common carrier" kind of situation.



