Hacker News: dijit's comments

There is already so much “space stuff” up there that launching spacecraft is increasingly difficult.

The next comment will be “but they can have short orbits”, but that ignores the fact that they can still collide with other objects, and if it's so cheap we will launch thousands of them for bandwidth.

As always: technical solutions to political problems are a band-aid and make everything worse; let's beat our politicians to death (metaphorically) instead.



I said the same thing to myself.

But then I remembered that what AWS gives you is the same generation of CPU, just obfuscated.

GCP also obfuscates it, but not as much: https://docs.cloud.google.com/compute/docs/general-purpose-m...

(note: Skylake is 10 years old)
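If you want to see past the obfuscation, you can read the CPU model string from inside the instance itself. A minimal, Linux-only sketch (the `/proc/cpuinfo` path and "model name" field are standard; the example model string in the comment is just an illustration of a Skylake-era Xeon):

```python
# Minimal sketch: discover which CPU generation a cloud VM actually
# landed on by reading /proc/cpuinfo (Linux only). Providers often
# advertise only a vCPU count, but the model string gives away the
# microarchitecture (e.g. "Xeon Platinum 8175M" is Skylake-SP).
def cpu_model(path="/proc/cpuinfo"):
    try:
        with open(path) as f:
            for line in f:
                if line.startswith("model name"):
                    # Format is "model name\t: <string>"
                    return line.split(":", 1)[1].strip()
    except OSError:
        pass
    return "unknown"

if __name__ == "__main__":
    print(cpu_model())
```

Running this on a freshly provisioned VM is a quick sanity check that the “same generation” you were sold isn't several generations old.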


I'd argue that Siri (like the touch-based keyboard) has actually regressed.

I recall scoffing at the idea of a touchscreen-based keyboard because of how close the buttons already were on my Nokia E63, and contending that without the tactile differentiation I wouldn't be able to type effectively.

Even on a 4" screen, it worked wonderfully, almost like magic.

Nowadays I'm forced to have a much larger screen and yet I make significantly more mistakes than I used to.

The same is true for Siri: things like "How's the weather outside?" and "I'm thinking of going for a walk" would bring back results like "It's currently raining" or "Bring a jacket, it will be cold".

There were also all kinds of fun interactions hidden in there.

Now it's "if you ask me again on your computer" or "I found some web results".

My HomePod also insists that it doesn't know where I live, despite my address being listed in the settings. And I can't ask it about my environment unless I use VERY specific wording, despite it having a digital thermometer built in (based on the fact that I can see it in my Home app).

Idk, something is fucky in the land of "once great" software from Apple.


I personally suspect that it’s because they had a huge hiring surge in the last decade, mostly brogrammers.

We are now at “second generation” brogrammers, where the initial bunch are interviewing and hiring the next bunch, being careful to select jargonauts that don’t make them feel uncomfortable. They have also established the corporate culture.

It’s been happening at lots of companies. It’s just more jarring with Apple, because we expect more from them.


There’s a palpable assault on expertise afoot in the Anglosphere. It’s been going on for decades, at least since the rise of the counterculture in the 1960s, but what’s new is how pervasive it feels. Even software companies, once the nerdiest of institutions, would now rather fail to produce functioning software than identify and cultivate expertise. Ten years ago, we, or at least I, failed to recognize “nerds are cool now” as the cultural trojan horse it was. Nerds, experts, were never going to be cool; the cool kids saw money and power accumulating around nerds, and they muscled their way in.

Do any long-time Apple insiders here know why Apple's software has gotten so much worse over the last 5+ years?

Reminds me of the old adage: your most bitter employee is the person who was most full of hope.

Windows developers (like sysadmins) are of two kinds in my experience.

People who don't understand shit about how the system behaves and are comfortable with that. "I install a package, I hit the button, it works"

.. and

People who understand very deeply how computers work, and genuinely enjoy features of the NT Kernel, like IOCP and the performance counters they offer to userland.

What's weird to me is that the competence is bimodal; you're either in the first camp or the second. With Linux (plus BSD, Solaris, etc.) it's much more of a spectrum.

I've never understood exactly why this is, but it's consistent. There's no "middle-good" Windows developer.


The "install package, press button, it works" approach is great when you just want a boring OS, when your interest lies elsewhere rather than in an itch to make the machine a perfect extension of yourself.

The machine and its installation are just fungible.

I think I've had Linux as a primary OS twice, FreeBSD once, and macOS once; what pulled me back each time was software and fiddling.

I'm on the verge of giving Linux or macOS another shot, though; some friends have claimed that the fiddling is virtually gone on Linux these days, and Wine also seems more than capable now of handling the software that brought me back.

But also, much of the software is available outside of Windows today.


Probably because Windows users live in walled knowledge domains that tend to reinforce levels of competence (or the lack of it).

Gamers tend to be somewhere in the middle though.


Unix is easier to understand than the NT mess, and everything is in the open and documented, so you can reach a good middle level of knowledge. To understand NT deeply, on the other hand, you must be a reverse engineer. That said, crazy experts do exist on the other side: Wine hackers (in both directions, Unix and NT), OpenBSD, and 9front people are on par with those NT wizards. It's just that with Unix/9front you climb an almost flat slope (even flatter with the latter) thanks to the crazily simple design, while with NT the knowledge is damn expensive to earn.

With 9front you of course need expertise on par with NT, but with far less effort. The books (9intro), the papers, CSP for concurrency... it's all there, there's no magic; you don't need OllyDbg or an NT object explorer to understand OLE and COM, for instance.

Reverse engineer 9front? Maybe for some issue while debugging, because the rest is at /sys/src, and if something happens you just point Acid under Acme to go straight to the offending source line. The man pages cover everything. Drivers are 200x smaller and more understandable than both NT's and Unix's. Meanwhile, to do the same under NT you must almost be able to design an ISA by yourself, plus some trivial compiler/interpreter/OS for it, because there's no open code for anything. And no, Wine is not a reference, but a reimplementation.


That's kinda true for the older/integrated parts of Windows: lots and lots of functionality that people have come to rely on over the years, but also huge black boxes that you must not be intimidated to probe into when solving weird issues (which often become understandable if you have enough experience as a developer to infer the likely internal implementation from what the API surface exposes).

Yeah, I have a similar situation; FreeBSD is a great operating system, but the sheer amount of investment in Linux makes all the warts semi-tolerable.

I'm sure some people have a sunk-cost feeling with Linux and will get defensive about this, but ironically this was exactly the argument I heard 20 years ago, and I was defensive about it myself then. It has only become more true, though.

It's really hard to argue against Linux when even architecturally poor decisions are papered over by sheer force of will and investment; so in day-to-day use Linux is often the happy path, even though the UX of FreeBSD is more consistent over time.


> even Microsoft has abandoned it for new apps.

Ok, all other things being equal: Microsoft is no longer a good arbiter of UI/UX design.

This is extremely well documented.

Old doesn't automatically mean worse, though I understand that people feel that way on an emotional level when they see old "ugly" UI.



LinkedIn has been breached a lot over time.

But I have such low faith in the platform that I would readily believe that once they think you're not going to continue adding value, they find unpleasant ways to extract the last bit of value that they reserve only for "ex"-users.


> LinkedIn has been breached a lot over time.

Yeah, but the OP got spam within hours. That would be pretty unlikely to have coincided with a breach.

But LinkedIn probably sold the data; they have a dark-pattern maze of privacy settings, and most default to ON.

