
I get your point, and I don't have an objective answer to it. We believe the internet is an open medium, and that there is immense value for humankind waiting to be discovered and unlocked in its data. After all, many of the world's big tech companies rely heavily on web scraping.

Rate limits can be applied for different reasons. If they protect a website from being overloaded, they are good in our opinion. If they exist to block competition, research, or the building of new non-competitive products that are valuable and not harmful to the original website, they are not ideal.

We leave the ethics of each project to the user's judgment and just provide the tools.
