This has been my theory for a while: this autumn, Apple will release a version of Apple Intelligence that runs locally and works better than ChatGPT. They will do this because 1) they do not have an AI offering yet, and 2) they have amazing hardware that even now can almost pull it off on open models, and this will presumably not be possible to replicate on Android for a long time.
This will crush OpenAI.
Note: I am not talking about coding here. That will take a while longer, but once it is optimized to the bone and LLM output has stabilized, you will be running that on local hardware too. Cost will come down for Claude and friends as well, but why pay 5 when you can have it for free?
> This has been my theory for a while: during this autumn Apple will release a version of Apple Intelligence that runs locally and works better than ChatGPT.
In this theory, can you explain why Apple has announced it’s paying Google for Gemini too?
Eventually, this may be true. This autumn? Highly unlikely.
Cool! We also build client-server sync for our local-first CMS:
https://github.com/valbuild/val
Just like your docsync, it has to both guarantee ordering and sync to multiple kinds of servers (your own machine for local dev, a cloud service in prod).
The base format is RFC 6902 JSON Patch.
Reading your spec sheet, it looks very similar :)
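For anyone unfamiliar with RFC 6902: a patch is just an ordered list of operations against a JSON document, which is why ordering guarantees matter so much for sync. Here is a minimal sketch in TypeScript that applies `replace`/`add` operations on object keys (a real implementation handles all six op types, array indices, and error cases):

```typescript
// Minimal RFC 6902 sketch: only "replace" and "add" on object keys.
type PatchOp =
  | { op: "replace"; path: string; value: unknown }
  | { op: "add"; path: string; value: unknown };

function applyPatch(doc: object, ops: PatchOp[]): any {
  const out: any = structuredClone(doc); // never mutate the source doc
  for (const { path, value } of ops) {
    // JSON Pointer: split on "/", unescape ~1 -> "/" and ~0 -> "~"
    const parts = path
      .split("/")
      .slice(1)
      .map((p) => p.replace(/~1/g, "/").replace(/~0/g, "~"));
    let target = out;
    for (const key of parts.slice(0, -1)) target = target[key];
    // For object keys, "replace" and "add" behave the same here
    target[parts[parts.length - 1]] = value;
  }
  return out;
}

const doc = { title: "Hello", meta: { draft: true } };
const patched = applyPatch(doc, [
  { op: "replace", path: "/title", value: "Hello, world" },
  { op: "replace", path: "/meta/draft", value: false },
]);
// patched → { title: "Hello, world", meta: { draft: false } }
```

Because operations are an ordered list, two clients that apply the same patches in the same order converge on the same document — which is exactly the property a sync layer has to guarantee.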
Looks really cool, I would love to use it in my DollarDeploy project. The documentation could still be a bit better: it is not clear whether content is pure Markdown or TypeScript files, or which GitHub repo it synchronizes to. I prefer a monorepo approach.
What sets it apart is that it 1) stores content in TS / JS files and 2) is a fully fledged CMS.
It is designed to be nice to work with from the start of a project (when the structure of your app changes all the time) -> through when everyone works on individual PRs -> to the end when the project is decommissioned.
It needs no cloud APIs, no DBs nor caching. No query language to learn. No sign up to get started.
It is fully type-safe and needs no type generation. You can rename and refactor content from your IDE.
It works amazingly with Cursor and friends (local content and schema + strong type safety + validation).
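To illustrate the "content as typed TS files" idea (hypothetical file name and shapes, not Val's actual API): content is an ordinary typed value, so renaming a field is a normal IDE refactor and the compiler flags stale references, with validation as plain code:

```typescript
// content/blog.ts -- hypothetical example of content stored as typed TS.
type BlogPost = {
  title: string;
  author: string;
  tags: string[];
};

export const posts: BlogPost[] = [
  { title: "Local-first sync", author: "Ada", tags: ["cms", "sync"] },
];

// Validation is just a function over the typed content; editors
// (human or LLM) get the schema and the rules in the same file.
export function validate(post: BlogPost): string[] {
  const errors: string[] = [];
  if (post.title.length === 0) errors.push("title must not be empty");
  if (post.tags.length === 0) errors.push("at least one tag required");
  return errors;
}
```

This is also why it plays well with Cursor and friends: the schema, the content, and the validation rules all live in local TS files the model can read and type-check against.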
Current requirements are: Next.js and GitHub.
APIs are pretty stable. UI is in the process of a revamp.
Will do a proper Show HN some time in the near future.
> It's entirely possible that we do become obsolete for a wide variety of programming domains. That's simply a reality…
It is not a reality, since it has not happened in the real world.
There is no reason to believe that the current rate of progress will continue. Intelligence is not like the weaving machine. A software engineer is not a human calculator.
To be fair he didn't say it is the reality now, he said the possibility is a reality. At least that's how I read his sentence.
And yeah, I do think it's a real possibility now.
I really liked this article. I think all of these points, in particular the conclusion about cost vs. benefit, could apply equally whether you are talking about formal methods or web apps. This way of seeing things, from an engineering perspective, is exactly how we (at least I) see it, though we mostly do web apps.