Hacker News

The problem isn't the 500 MB disk footprint, it's all the RAM wasted loading redundant libraries. Chrome is already a memory hog on its own; imagine every application suddenly bringing in its own copies of its libraries.
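The RAM point rests on how shared libraries normally work: on a Linux host, one library file is mapped into many processes, but its read-only pages occupy physical memory roughly once. Containers built from different images each ship their own copy of the file, so that sharing is lost. A quick way to see how widely one library is shared on a host (a sketch, assuming a Linux system with /proc mounted):

```shell
# Count how many running processes have some libc mapped.
# All of these mappings point at the same file on the host,
# so the read-only pages are backed by one copy in the page cache.
sharers=$(grep -l 'libc' /proc/[0-9]*/maps 2>/dev/null | wc -l)
echo "processes mapping libc: $sharers"
```

Containers from different images ship different library files (different inodes), so the kernel cannot share those pages even when the library versions are byte-identical.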


Also, the network requirements to install and update such large containers are a problem. A lot of people don't appreciate how slow the internet is in most of the world compared to those with access to any kind of fiber endpoint. This extends to similar concepts like snap packages.


RAM is the same situation as disk for me — I have plenty to waste.

Again, not advocating Chrome in a container, I don’t even run Chrome outside of a container. I just think it’s odd to get hung up on these sorts of resource requirements given the state of computing.


You may have plenty of resources, but there are many who have to "make do" with 8 GB, 4 GB, or even less RAM.


Totally fair, but I think the overlap in the Venn diagram of folks that use Docker and folks that have 4 GB of RAM is pretty slim. If resources are that limited, Docker might not be the best choice for running anything.

Related: Chrome on 4GB of RAM sounds painful. Thoughts and prayers to those folks.



