I really like the framing here (via Richard Sennett / Roland van der Vorst): craft is a relationship with the material. In software, that “material consciousness” is built by touching the system—writing code, feeling the resistance of constraints, refactoring, modeling the domain until it clicks.
If we outsource the whole “hands that think” loop to agents, we may ship faster… but we also risk losing the embodied understanding that lets us explain why something is hard, where the edges are, and how to invent a better architecture instead of accepting “computer says no.”
I hope we keep making room for “luxury software”: not in price, but in care—the Swiss-watch mentality. Clean mechanisms, legible invariants, debuggable behavior, and the joy of building something you can trust and maintain for years. Hacker News needs more of that energy.
> I hope we keep making room for “luxury software”
The risk of making a compiler's source a black box is pretty high.
Because if your program is a black box compiled with a black box, things might get really sporty.
There are many platform libraries and such that we probably do not want as black boxes, ever. That doesn't mean an AI can't assist, but that you'll still rewrite and review whatever insights you got from the AI.
If the latest game or Instagram app is a black box nobody can decipher, who cares? But if such an app goes on to be a billion-dollar success, I'll feel sorry for the engineers tasked with figuring out why the computer says no.
I’m evaluating UI options for a new .NET product and I’m struggling to find balanced, experience-based pros/cons about Blazor (Server / WASM / Hybrid) compared to alternatives (React/Vue/Angular + API, MAUI/WinUI/WPF, etc.).
This post dives into that "black magic" layer, especially in the context of emerging thinking models and tools like Ollama or GPT-OSS. It’s a thoughtful look at why sampling, formatting, and standardization are not just implementation details, but core to the future of working with LLMs.
CRDTs and HLCs often feel like over-engineering. In most business/management apps, it’s rare for two people to edit the same piece of data at the exact same time.
A simpler approach works surprisingly well:
Store atomic mutations locally (Redux-like) in SQLite.
Sync those mutations to a central server that merges with last-writer-wins.
Handle the few conflicts that do occur through clear ownership rules and some UI/UX design.
This makes local-first behave more like Git: clients work offline, push their “commits,” and the server decides the truth. Most of the complexity disappears if the data model is designed with collaboration and ownership in mind.
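The mutation-log-plus-last-writer-wins approach above can be sketched in a few lines. This is a minimal illustration, not a production sync engine: the schema, the `record`/`merge_lww` names, and the wall-clock timestamps are all assumptions for the sake of the example (real systems usually want monotonic or server-assigned timestamps to avoid clock skew).

```python
import sqlite3
import time

def open_log(path=":memory:"):
    # Local SQLite log of atomic mutations (the Redux-like store).
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS mutations (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        entity TEXT, field TEXT, value TEXT,
        ts REAL, synced INTEGER DEFAULT 0)""")
    return db

def record(db, entity, field, value):
    # Append a mutation locally; works fully offline.
    db.execute(
        "INSERT INTO mutations (entity, field, value, ts) VALUES (?, ?, ?, ?)",
        (entity, field, value, time.time()))
    db.commit()

def merge_lww(server_state, mutations):
    # Server-side merge: per (entity, field), the newest timestamp wins.
    for entity, field, value, ts in mutations:
        key = (entity, field)
        current = server_state.get(key)
        if current is None or ts > current[1]:
            server_state[key] = (value, ts)
    return server_state
```

A client pushes its unsynced rows (`SELECT ... WHERE synced = 0`) like Git commits; the server folds them in with `merge_lww` and its state is the truth. Conflicts on the same field resolve silently to the latest write, which is exactly the trade-off the comment argues is acceptable for most business apps.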