Hacker News | cellis's comments

A 16-year-old has been training for almost 16 years to drive a car. I would argue the opposite: Waymo and similar specialized AIs need far less data than humans do. Humans can generalize their training, but they definitely need a LOT of it!


When humans, or dogs or cats for that matter, react to novel situations they encounter, when they appear to generalize or synthesize prior diverse experience into a novel reaction, that new experience and new reaction feeds directly back into their mental model and alters it on the fly. It doesn't just tack on a new memory. New experience and new information back-propagates constantly, adjusting the weights and meanings of prior memories. This is a more multi-dimensional alteration than simply re-training a model to come up with a new right answer... it also exposes to the human mental model all the potential flaws in all the previous answers, which may have been sufficiently correct before.

This is why, for example, a 30 year old can lose control of a car on an icy road and then suddenly, in the span of half a second before crashing, remember a time they intentionally drifted a car on the street when they were 16 and reflect on how stupid they were. In the human or animal mental model, all events are recalled by other things, and all are constantly adapting, even adapting past things.

The tokens we take in and process are not words, nor spatial artifacts. We read a whole model as a token, and our output is a vector of weighted models that we somewhat trust and somewhat discard. Meeting a new person, you will compare all their apparent models to the ones you know: facial models, audio models, language models, political models. You ingest their vector of models as tokens and attempt to compare them to your own existing ones, while updating yours at the same time. Only once our thoughts have arranged those competing models we hold into some kind of hierarchy do we poll them for which ones are appropriate to synthesize words or actions from.


In a word, JEPA?


No. Not at all like that. I said:

>> nor spatial artifacts

I meant visual patterns, too. You're thinking about what I said on too granular a level. JEPA is visual, based ultimately on pixels. The tokens may be digested from pixels until they're as large as whole recognizable objects, but the tokens are not whole mental models themselves.

Here's an example of humans evaluating competing mental models as tokens: You see a car, it's white, it's got some blood stains on the door, and it's traveling towards a red light at 90 miles an hour in a 30 mph residential zone, while you're about to make a left turn. A human foot is dangling from the trunk.

You refer to several mental models you have about high speed chases, drug cartels in the area, murders, etc. You compare these models to determine the next action the car might take.

What were the tokens in this scenario? The color of the car, the pixels of blood, the speed, the traffic pattern? Or whole models of understanding behavior where you had to choose between a normal driver's behavior and that of someone with a dead body fleeing a crime scene?


No 16 year old has practiced driving a car for 16 years.


They were practicing object recognition, movement tracking and prediction, self-localisation, visual odometry fused with proprioception and the vestibular system, and movement control for 16 years before they ever sat behind a steering wheel, though.


If you see gaining fine motor control, understanding pictographic language […] as a prerequisite to driving a car, then yes, all of them are.


That's an exaggeration. Nobody is trained to read STOP signs for 16 years; a few months, tops. And Waymo doesn't need to coordinate a four-limbed, 20-digited, one-headed body to operate a car.


Well, I also think there is a lot that we process 'in the background' and learn beforehand in order to learn how to drive, and then drive. I think the 'fairest' comparison would be to figure out the absolute lowest age at which kids could perform well on the streets behind a steering wheel.


I am not making the point that it is; rather, I am expanding on a possible perspective in which 16 years of training produce a human driver.

That being said, you don't really need training to understand a STOP sign by the time you're required to; it's pretty damn clear, being one of the simpler signs.

But you do get a lot of "cultural training" so to speak.


Do you only support USDC?


Yes, USDC on Base for now. The architecture is token-agnostic though — adding USDT or DAI is mostly config, not a rewrite. Base was the starting point because of low fees and Circle's native USDC support. Which token/chain would be most useful for you?
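To make "mostly config, not a rewrite" concrete, a token-agnostic setup might look roughly like this. This is a hypothetical sketch: the type, field names, and placeholder addresses are my assumptions, not the project's actual code.

```typescript
// Hypothetical token registry for a token-agnostic payments layer.
// Contract addresses are placeholders, not real values.
type TokenConfig = {
  symbol: string;    // e.g. "USDC"
  chain: string;     // e.g. "base"
  contract: string;  // on-chain token contract address
  decimals: number;  // USDC and USDT both use 6
};

const supportedTokens: TokenConfig[] = [
  { symbol: 'USDC', chain: 'base', contract: '0x…', decimals: 6 },
  // Adding USDT or DAI would be one more entry here, not a rewrite:
  // { symbol: 'USDT', chain: 'base', contract: '0x…', decimals: 6 },
];

// Look up a token's config by symbol; undefined if unsupported.
function findToken(symbol: string): TokenConfig | undefined {
  return supportedTokens.find((t) => t.symbol === symbol);
}

console.log(findToken('USDC')?.chain); // "base"
```

The rest of the system would consume `TokenConfig` values rather than hard-coding one asset, which is what makes new tokens a config change.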


79. I feel like I should have done better, but I got stuck in a local minimum of "farm animals, which obvious farm animals haven't I said??", then tried thinking of names of fish, which worked until it didn't.


The “hard enough” tasks are all behind IP walls. If a task is “hard enough”, that generally means it’s a commercial problem, likely involving disparate workflows and requiring a real human who probably isn’t a) inclined and/or b) permitted to publish the task. The incentives are aligned to capture all value from solving that task for as long as possible and only then publish.


I solve plenty of hard problems as a hobby


I've also used em-dashes since before ChatGPT, but not on HN -- because a double dash is easier to type. In my notes app, however, they're everywhere, because macOS autoconverts double dashes to em-dashes.


While directionally correct, the article spends a lot of time glorifying jQuery and not enough on what a horrible, no-good, unoptimized mess of a framework jQuery was, and by extension what kinds of websites were built back then. I remember those times well. The reason to use React wasn't that it was new, far from it. It was that it won vs. Ember, Angular, et al. around 2014-2015 as the best abstraction, because it was easiest to reason about. It still wasn't great. In fact, it still isn't great. But it's the best blend of many leaky abstractions we use to code against the browser APIs.


jQuery was an unoptimised mess? It's like 30 KB minified, and it just bridged a bunch of functionality that browsers lacked, as well as providing a generic API that let you (often) ignore per-browser implementation and testing of your code.

There's no reason to blame it for the types of websites being made, either; it doesn't really provide enough functionality to influence the type of site you use it on.


Since when did we start using file size as a measure of efficiency or optimization?

Off the top of my head: $() CSS parsing and DOM traversal was way slower than querySelector or getElementById, both of which predate jquery by years. Every $('.my-class') call created wrapped objects with overhead. Something like $('#myButton').click(fn) involved creating an intermediate object just to attach an event listener you could've done natively. The deeper the method chaining got, the worse the performance penalty, and devs rarely cached the selectors, even in tight loops. It was the PHP of JavaScript, which is really saying something.
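To illustrate the selector-caching point, here's a toy stand-in for $() (not real jQuery; it just counts how many DOM traversals each style pays for):

```javascript
// Toy $(): every call simulates a selector parse + DOM traversal,
// returning a fresh wrapper object, like jQuery did.
let lookups = 0;
const fakeDOM = { '.my-class': ['a', 'b', 'c'] };

function $(selector) {
  lookups++; // each call pays the parse/traversal cost again
  return {
    elems: fakeDOM[selector] || [],
    addClass(name) { return this; }, // chainable no-op for the demo
  };
}

// Uncached: the common anti-pattern, re-querying inside a loop.
for (let i = 0; i < 100; i++) {
  $('.my-class').addClass('hot');
}
console.log(lookups); // 100

// Cached: one traversal, then reuse the wrapper.
lookups = 0;
const $cached = $('.my-class');
for (let i = 0; i < 100; i++) {
  $cached.addClass('hot');
}
console.log(lookups); // 1
```

Same visible behavior, 100x fewer traversals; that gap is exactly what uncached selectors in tight loops cost.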

By the early-2010s most of the library was dead weight since everyone started shipping polyfills, but people kept plopping down jquery-calendar like it was 2006.

(I say this as someone who has fond memories of using jQuery in 2007 to win a national competition in high school, after which I became a regular contributor for years.)


> $() CSS parsing and DOM traversal was way slower than querySelector or getElementById, both of which predate jquery by years.

You have that backwards – jQuery predates querySelector by years.

The reason why getElementById is fast is because it’s a simple key lookup.


Both querySelector and querySelectorAll came well after jquery. I remember it being a big deal when browsers added support for them.


> By the early-2010s most of the library was dead weight

Absolutely correct. This is because a lot of the shit jQuery did was good, and people built it into the browser because of that.

Putting jQuery into a site now would be insane, but at the time it pushed the web forward by quite a leap.


There are 3 counts of “jquery” in the text. Once again, which one of them glorifies it?



Go back far enough and everything was stolen from someone


That's so wrong.


ChatGPT does not offer a clear indication that it is blatantly illegal, though the amount of legal risk is high enough that it might as well be.


ChatGPT isn't even a search engine, let alone a lawyer.


Electricians in data-center states are eating; elsewhere they're scraping by due to macroeconomics.

