Oh yeah, I remember how some computer pulled a Windows update over a satellite connection during a research flight (aircraft). That was super expensive, wow. Now Microsoft servers are blocked at the uplink, since you couldn't reliably stop it on the computer itself and new teams with new computers keep coming in.
I'm not letting Microsoft off the hook here, but if you have an expensive metered connection and you're trusting clients (especially a modern personal computer, of any operating system) to play nicely with bandwidth, that's 100% on you.
That's a really sorry state of things, then. There's zero trust in software now, in the literal sense. How did we get to a world where you can't trust a client to enforce its own documented behavior? How did it become the user's fault for not applying OS- and hardware-level measures, rather than the software vendor's fault, when the "Automatic updates" toggle is a no-op?
MBAs/consultants hijacked the industry, along with an influx of people who consider leetcode alone sufficient for hiring. The past 10 years have seen a major injection of these people into big tech. The resulting mess is predictable, and it'll get worse too, which is why we need to break up these companies and let better, more efficient companies take their place rather than letting them subsidize their failures with their monopolies.
Quite true! These last years I've been working at companies where security audits are done by very non-technical people who have no idea what a CVE actually is beyond its "Severity" level.
Way early in my career, the "security auditors" were cybersecurity experts who had programmed since they were children and actually read and tested the code.
In an environment where bandwidth utilization costs money I think it's a good belt-and-suspenders approach, regardless of the expected behavior of the clients, to enforce policy at the choke point between expensive and not-expensive.
(I think more networks should be built with default-deny egress policies, personally. It would make data exfiltration more difficult, would give ML algorithms monitoring traffic flows less "noise" to look through, and would likely encourage some efficiency on the part of dependencies.)
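For concreteness, a default-deny egress policy like the one described can be sketched as an nftables config fragment. This is purely illustrative: the resolver and proxy addresses are placeholder assumptions, not anything from the thread, and a real deployment would need rules matched to its own services.

```
# Minimal default-deny egress sketch (nftables); addresses are placeholders.
table inet filter {
    chain output {
        type filter hook output priority 0; policy drop;  # deny egress by default

        ct state established,related accept   # allow replies on existing flows
        oif "lo" accept                       # local loopback traffic

        udp dport 53 ip daddr 10.0.0.53 accept          # DNS only via internal resolver
        tcp dport { 80, 443 } ip daddr 10.0.10.5 accept # HTTP(S) only via the proxy host
    }
}
```

Anything not explicitly allowed (including an OS phoning home to an update CDN) is simply dropped at the choke point, which is the "belt-and-suspenders" idea from the comment above.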
Software design is not really my wheelhouse, so I can't comment meaningfully on that, but on the networking side I can very confidently say it was a poor architecture. You simply cannot assume that all of your clients are going to be both 1) non-malicious and 2) working exactly as you think they will.
Link saturation would be one of the first things that would come to mind in this situation, and at these speeds QoS would be trivial even for cheap consumer hardware.
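To illustrate the "QoS is trivial at these speeds" point, here is a sketch using Linux `tc` with HTB, roughly what cheap router firmware does under the hood. The interface name and rates are made-up assumptions for a slow satellite-class uplink, not figures from the thread.

```
# Sketch: cap total egress and deprioritize bulk traffic with HTB.
# "eth0" and the 2 Mbit rate are illustrative placeholders.
tc qdisc add dev eth0 root handle 1: htb default 20
tc class add dev eth0 parent 1: classid 1:1 htb rate 2mbit ceil 2mbit
tc class add dev eth0 parent 1:1 classid 1:10 htb rate 1500kbit ceil 2mbit prio 1  # interactive
tc class add dev eth0 parent 1:1 classid 1:20 htb rate 500kbit ceil 2mbit prio 7   # bulk/default
```

With filters steering interactive traffic into class 1:10, a runaway update download in the default class can never saturate the link on its own.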
Well, on the software design side, there's plenty of scenarios where undocumented behavior crops up on unexpected network interruption. In the example above, Windows can even pre-download updates on metered connections during one time period, then install those updates during another. The customers really can't take the blame for that, IMO.
I think software quality across society has rapidly deteriorated, mostly because of the devaluing of software design. No one expects quality from software, everyone "understands there are bugs," and some like to take advantage of that. And so the Overton window gets pushed toward "broken forever, good luck holding the bag if you use it" rather than the more realistic "occasionally needs a restart IFF you hit an issue, and it takes less than 10 seconds with minimal data loss."
Fair enough, but the fact is that until fairly recently most software wouldn't even pretend to care about conserving bandwidth. I certainly would never expect a desktop OS to do this well, even if MS loves their revenue-generating "bugs."
How do you mean? On my Linux laptops, updates never happen unless I trigger them, and nothing really changes even years after the last installation. You could boot one up, use what's there, and just never update.
> since you couldn't reliably stop it on the computer itself and new teams with new computers come in.
Wifi connection settings in Windows have a "metered connection" setting, which disables automatically downloading updates. I don't recall exactly when this was introduced, but I had to use it for a year while I was stuck on satellite internet. You can even set data caps and such.
Of course, it's off by default, and I have no idea whether there's any way to provision the connection via enterprise admin so it defaults to on for a particular network (I would assume not), so you'd be stuck hoping everyone who comes in does the right thing.
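For what it's worth, recent Windows versions do expose a command-line way to mark a saved Wi-Fi profile as metered, which a deployment script could apply for new machines. This is a sketch from memory, so verify it against your Windows version; the profile name is a placeholder.

```
:: Mark an existing Wi-Fi profile as metered ("Fixed" cost), from an elevated prompt.
:: "ExpensiveSatNet" is a hypothetical profile name for illustration.
netsh wlan set profileparameter name="ExpensiveSatNet" cost=Fixed
```

It still relies on every incoming machine actually running the script, so network-side enforcement remains the safer backstop.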
Please get an active carbon filter for your air inlet. They’re not expensive and most cars only have the pollen filter version.
We found very high pollution values inside cars which in hindsight is very much not surprising due to what you noticed. A carbon filter brings it way down.
Yep. This is low hanging fruit for making constituents happy. I emailed my city councilor about this because I couldn't figure out who to ask at public works. The next morning a work truck rolled up and installed a shield on the light nearest to my house.
One of the best rooms I ever had was in Norway. It was at most 3 times as large as the queen-sized bed, had at least 5 different lighting options, a fantastic blackout curtain (needed above the Arctic Circle), and tons of small storage places and hooks - and this includes a tiny bathroom. Everything you need, and super cozy.