Hacker News | new | past | comments | ask | show | jobs | submit | wyldfire's comments | login

> the world has changed.

It's the effect of a cult of personality. People don't feel like they want or need this. But they're on board with the cult.


There's a stark difference between de jure and de facto here. Executive orders have brazen, tyrannical effects and are often reined in late or never.

"I expected an automaton to be a good source of entropy and it turns out it is not."

BTW, the LLM here is doing a great job of emulating humans: humans are not good at this task either.

> Nine parameter combinations produced zero entropy — perfectly deterministic output

They'd need some kind of special training to go request entropy from a system entropy device. Behaving deterministically is a feature, not a bug.
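To make the distinction concrete, here's a small Python sketch (my own illustration, not from the article): Shannon entropy of bytes pulled from the OS entropy device versus a deterministic generator, which stands in for greedy, temperature-0 decoding.

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# The system entropy device -- what a model would have to be taught to call:
print(shannon_entropy(os.urandom(4096)))    # close to 8 bits/byte

# A deterministic "generator" (same input, same output, every run):
print(shannon_entropy(bytes([42]) * 4096))  # 0.0
```

The second result is the "zero entropy, perfectly deterministic output" case from the quote above.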


Just because the LLM happens to be bad at something humans are also bad at, doesn’t mean the system is “emulating humans”.

> how incredible the human brain is compared to computers.

It is pretty incredible but people will (rightly so?) hold automated drivers to an ultra high standard. If automated driving systems cause accidents at anywhere near the human rate, it'll be outlawed pretty quickly.


> If automated driving systems cause accidents at anywhere near the human rate, it'll be outlawed pretty quickly.

This is evidently false. Robotaxi crash rates exceed human drivers', but there's not an effective regulatory agency to outlaw them!

https://futurism.com/advanced-transport/tesla-robotaxis-cras...


According to that article, Waymo crashes 2.3x more often than human drivers (every 98k miles vs 229k miles), which is clearly false. I think it's far more likely that humans don't report most minor collisions to insurance, and that both Robotaxis and Waymo are safer than human drivers on average.

> According to that article, Waymo crashes 2.3x more often than human drivers (every 98k miles vs 229k miles), which is clearly false.

Why is it clearly false? It might be false, but clearly? I would definitely like to see evidence either way.

> I think it's far more likely that humans don't report most minor collisions to insurance, and that both Robotaxis and Waymo are safer than human drivers on average.

That sounds like you are trying to find reasons to get the conclusion you want.


The NHTSA requires a report when any automated driving system hits any object at any speed, or if anything else hits the ADS vehicle resulting in damage that is reasonably expected to exceed $1,000.[1] In practice, this means that everyone reports every ADS collision, since even trading paint between two vehicles can result in >$1k in total damage.

If you go to the NHTSA's page regarding their Standing General Order[2] and download the CSV of all ADS incidents[3], you can filter where the reporting entity is Waymo and find 520 rows. If you filter where the vehicle was stopped or parked, you'll find 318 crashes. If you scan through the narrative column, you'll see things like a Waymo yielding to pedestrians in a crosswalk and getting rear-ended, or waiting for a red light to change and getting rear-ended, or yielding to a pickup truck that then shifted into reverse and backed into the Waymo. In other words: the majority of Waymo collisions are due to human drivers.
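If you want to reproduce that filtering, here's a stdlib-only Python sketch. Note the column headers below are illustrative assumptions standing in for the real SGO CSV schema (check NHTSA's data dictionary for the actual names), and the rows are toy data:

```python
import csv
import io

# Toy stand-in for the SGO incidents CSV downloaded from NHTSA.
# Header names here are assumptions for illustration only.
toy_csv = io.StringIO(
    "Reporting Entity,SV Pre-Crash Movement,Narrative\n"
    "Waymo LLC,Stopped,Rear-ended while waiting at a red light\n"
    "Waymo LLC,Proceeding Straight,Contact with road debris\n"
    "Other ADS Co,Stopped,Backed into while parked\n"
)

rows = list(csv.DictReader(toy_csv))

# Filter where the reporting entity is Waymo...
waymo = [r for r in rows if "Waymo" in r["Reporting Entity"]]

# ...then where the vehicle was stopped at the time of the crash.
stopped = [r for r in waymo if r["SV Pre-Crash Movement"] == "Stopped"]

print(len(waymo), len(stopped))  # 2 1
```

Against the real download, the same two filters are what produce the 520 and 318 counts mentioned above.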

So either Waymos are ridiculously unlucky, or when these sorts of things happen between two human driven cars, it's rarely reported to insurance. In my experience, if there's only minor damage, both parties exchange contact info and don't involve the authorities. Maybe one compensates the other for damage, or maybe neither party cares enough about a minor dent or scrape to deal with it. I've done this when someone rear-ended me, and I know my parents have done it when they've had collisions.

If human driven vehicles really did average 229k miles between any collision of any kind, we'd see many more pristine older vehicles. But if you pay attention to other cars on the road or in parking lots, you'll see far more dents and scratches than would be expected from that statistic. And that's not even counting the damage that gets repaired!

1. See page 13 of https://www.nhtsa.gov/sites/nhtsa.gov/files/2025-04/third-am...

2. https://www.nhtsa.gov/laws-regulations/standing-general-orde...

3. https://static.nhtsa.gov/odi/ffdd/sgo-2021-01/SGO-2021-01_In...


Definitely. I looked at Tesla's source for these numbers, and it looks like they primarily used data sourced from police reports, which most people only file if the incident is serious enough to turn into an insurance claim.

Tesla notes:

> These assumptions may contain limitations with respect to reporting criteria, unreported incident estimations (e.g., NHTSA estimates that 60% of property damage-only crashes and 32% of injury crashes are not reported to police

https://www.tesla.com/fsd/safety


A length could refer to lots of different units - elements, pages, sectors, blocks, N-aligned bytes, kbytes, characters, etc.

Always good to qualify your identifiers with units IMO (or types that reflect units).
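For example, a minimal Python sketch of the "types that reflect units" approach (names and the block size are mine, purely for illustration):

```python
from typing import NewType

# "length" alone is ambiguous; put the unit in the name or, better, the type.
LengthBytes = NewType("LengthBytes", int)
LengthBlocks = NewType("LengthBlocks", int)

BLOCK_SIZE_BYTES = 512  # assumed block size, for illustration

def blocks_to_bytes(n_blocks: LengthBlocks) -> LengthBytes:
    """Convert a block count to a byte count; the signature documents both units."""
    return LengthBytes(n_blocks * BLOCK_SIZE_BYTES)

print(blocks_to_bytes(LengthBlocks(8)))  # 4096
```

With a type checker, passing a raw `int` or a `LengthBytes` where a `LengthBlocks` is expected gets flagged, which is exactly the elements-vs-sectors-vs-bytes confusion the comment is about.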


You need to annotate your program with indications of what variable tracks the size of the allocation. So, sure, but first work on the packages in the distro.

Note that corresponding checks for C++ library containers can be enabled without modifying the source.[1] Google measured some very small overhead (< 0.5% IIRC) so they turned it on in production. But I'd expect an OS distro to be mostly C.

[1] https://libcxx.llvm.org/Hardening.html


Maybe they're greedy, or maybe they see that, in the long game, their architecture-licensing business is in serious jeopardy from RISC-V. So, if you can't beat 'em, join 'em.

Maybe they'll eventually make their own RV core designs too.


> Maybe they'll eventually make their own RV core designs too.

I am not a deeply technical embedded person, but I actually don't think that would be the death of ARM: my understanding is that they develop a lot of SoC-level interconnect/fabric standards and IP as well. After all, you have to do a lot of work to integrate your ARM cores into an actual platform...


The problem is they go from being at the center of everything outside x86-64 to just another RISC-V provider. And there will be dozens. The market won't care much which of them succeed or fail, since the ecosystem won't depend on any specific supplier. How does ARM stay at the top of that dogfight? It's a much bigger challenge than any they have faced so far.


The problem for ARM is that there are a dozen RISC-V companies implementing their business model.

You license ARM cores because you want a "custom" chip but do not want to start from scratch. You especially do not want to have to bootstrap a software ecosystem. When ARM had no competition, it was just a question of which ARM core you wanted.

Now, you can get the same thing from any RISC-V design house. Which means having real choice over the features you want. If ARM is just one of those RISC-V shops, how does ARM compete? By being the best? Not likely.

And, in the past, you could not totally outgrow ARM, as they own the ISA. The Qualcomm lawsuit was an attempt to maintain tight control over this. With RISC-V, you can pack up and move your whole ecosystem elsewhere, including taking it entirely in-house. This includes the ISA to an extent, since anybody can add extensions.

Today, we are seeing RISC-V succeed where this flexibility matters most: in microcontrollers and in AI.

But as performance equalizes, volumes go up and costs come down, the use cases where ARM makes more sense dwindle.

That makes backwards compatibility the last real reason to use ARM. But does this matter on mobile where devices download the apps that match their arch? Not really. Does it matter in most embedded cases? Not really. Does it even matter in the server? More, but even there not as much as it used to. Does it matter for anything mostly GPU or NPU driven? No. So that leaves desktop and laptop. And, outside of Apple, ARM has not really built up anything to stay compatible with. RISC-V may have time to grow into that niche before being blocked.

We are going to exit 2026 with RISC-V chips that are fast enough. How soon will the costs come down? Perhaps within a year or two?

What markets is ARM well positioned to continue its dominance in?


Qualcomm acquired talented designers and put them to work, not their existing (further encumbered) designs.


More robust in that it tolerates individualistic workers? Interesting.


More robust in that if everyone just does their job in accordance with process, you potentially create process traps and settle into bad local maxima.

You need some number of people with ego to tell you what they really think, to resist things they see as bad, etc. Otherwise you will waste untold sums in the time it takes to realize the mistakes they would have told you about a week after you rolled them out.


> Who is the buyer?

Who do you know who is currently sitting in a seat of massive power in the US Government, watches TV and says things like, "I need to have that! Why do we not have that already? It will project strength, and all the best governments project strength at every opportunity!"


Pretty sure Trump knows about the F-35 already.


"How many of those did we order? Let's get some more. Super-size me!"


"And they should be called Trump Jets, and have gold trim on them! And be on a memecoin with my name on it."

