
That's not a valid assumption for truly ambitious web applications.

For example, I have an Ember application running on both mobile and desktop that gives the user complete read/write access to their data even when the network goes down. When they get a connection again, everything synchronizes automatically.
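The core of that behavior can be sketched as an offline-first write queue: reads and writes always hit a local store immediately, and queued mutations are replayed to the server once the connection returns. This is a minimal illustration, not our actual code; the names (`OfflineStore`, `Mutation`, `sync`) are hypothetical.

```typescript
// Hypothetical offline-first store: writes apply locally at once,
// and pending mutations are replayed when connectivity returns.
type Mutation = { key: string; value: string };

class OfflineStore {
  private data = new Map<string, string>();
  private pending: Mutation[] = [];

  // Writes always succeed locally, so the UI never waits on the network.
  set(key: string, value: string): void {
    this.data.set(key, value);
    this.pending.push({ key, value });
  }

  // Reads are served from the local store, online or offline.
  get(key: string): string | undefined {
    return this.data.get(key);
  }

  // On reconnect, replay queued mutations to the server in order.
  // Returns how many mutations were flushed.
  sync(send: (m: Mutation) => void): number {
    const count = this.pending.length;
    for (const m of this.pending) send(m);
    this.pending = [];
    return count;
  }
}
```

A real implementation also needs durable local storage and conflict handling, but the shape is the same: the network is an asynchronous replication channel, not something the user ever waits on.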

This is an important requirement for us, and once we implemented it we realized that it has very useful side benefits as well: the entire application feels dramatically faster, because you're very rarely waiting on the server. Changes take effect instantly.

It also makes our backend infrastructure simpler. First, because we can take it down for a bit and nobody will even notice. Second, because once you have a true distributed synchronization algorithm running, adding another redundant server is no more complicated than adding another client. They're actually quite symmetric in many respects, running the same codebase.
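The client/server symmetry falls out naturally if every node runs the same convergent merge logic. As a hedged illustration (assuming a simple last-writer-wins strategy, which is one of several possible merge policies and not necessarily the one we use), any two replicas can synchronize in either direction and still converge:

```typescript
// Hypothetical last-writer-wins replica: the same merge code runs on
// clients and servers, so syncing a new server is no different from
// syncing a new client.
type Entry = { value: string; ts: number };

class Replica {
  state = new Map<string, Entry>();

  // Keep whichever write carries the newest timestamp.
  set(key: string, value: string, ts: number): void {
    const cur = this.state.get(key);
    if (!cur || ts > cur.ts) this.state.set(key, { value, ts });
  }

  // Merge another replica's state. The operation is commutative and
  // idempotent, so the order in which peers sync doesn't matter.
  merge(other: Replica): void {
    other.state.forEach((e, k) => this.set(k, e.value, e.ts));
  }
}
```

Because `merge` treats all peers identically, "adding a redundant server" is literally just introducing one more replica into the gossip.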


