I mostly agree, but I experienced a different challenge, precisely because of that consistency:
I used to work in the Chromium codebase (on the order of tens of millions of LOC), and the parts I worked in were generally in line with Google's style guide, i.e. consistent and of decent quality. The challenge was identifying legacy patterns that shouldn't be imitated or cargo-culted for the sake of consistency.
In practice that meant keeping my knowledge of the coding standards up to date so as not to perpetuate anti-patterns in the name of consistency.
Are you comparing tinybox red with 738 FP16 TFLOPS at $15K to Project Digits with 1 FP4 PFLOP at $3K? Or did they announce the Project Digits FP16 performance somewhere?
Going by the specs, this pretty much blows Tinybox out of the water.
For $40,000, a Tinybox pro is advertised as offering 1.36 petaflops of compute and 192 GB of VRAM.
For about $6,000, a pair of Nvidia Project Digits machines offers a combined ~2 petaflops of compute and 256 GB of memory.
The market segment for Tinybox always seemed to be people who were somewhat price-insensitive, but unless Nvidia completely fumbles on execution, I struggle to think of any benefit of a Tinygrad Tinybox over an Nvidia Digits. Maybe if you absolutely, positively need to run your OS on x86.
I'd love to see if AMD or Intel has a response to these. I'm not holding my breath.
You're right. Tinybox's 1.36 petaflops is FP16, so that's a significant difference.
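To make the two headline numbers comparable, here's a rough back-of-envelope sketch. It assumes (hypothetically, since Nvidia hasn't published Digits FP16 figures) the common vendor convention that each precision halving doubles throughput, so 1 FP4 PFLOP would correspond to roughly 250 FP16 TFLOPS:

```python
def fp4_to_fp16_tflops(fp4_tflops: float) -> float:
    # Assumption: FP4 throughput is ~4x FP16 throughput,
    # so divide the FP4 figure by 4 to normalize to FP16.
    return fp4_tflops / 4

digits_fp16 = fp4_to_fp16_tflops(1000)  # 1 FP4 PFLOP -> ~250 FP16 TFLOPS
tinybox_red_fp16 = 738                   # advertised FP16 TFLOPS

# Rough FP16 TFLOPS per dollar (prices from the thread above)
print(digits_fp16 / 3000)        # Digits at $3K
print(tinybox_red_fp16 / 15000)  # tinybox red at $15K
```

Under that (unconfirmed) scaling assumption, Digits still comes out ahead per dollar on raw compute, though the gap is much smaller than the headline petaflop number suggests.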
Also, Tinybox's memory bandwidth is 8,064 GB/s, while Digits seems to be around 512 GB/s, according to speculation on Reddit.
Moreover, Nvidia announced the RTX 5090 at $2K, which could put downward pressure on the price of the 4090s in the Tinybox. So the Tinybox green or pro models might get cheaper, or a 5090-based model might appear.
If you're the kind of person who's ready to spend $40K on a beastly ML workstation, there's still some upside to Tinybox.
You're missing the most critical part, though: memory bandwidth. It hasn't been announced for Digits yet, and it probably won't be comparable to that of dedicated GPUs.
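To see why bandwidth dominates here: LLM token generation is typically memory-bound, since every generated token must stream the active weights from memory once. A simple sketch of the resulting throughput ceiling, using the bandwidth figures from this thread and a hypothetical 70 GB of weights (e.g. a 70B-parameter model at 8-bit):

```python
def max_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    # Upper bound: tokens/sec cannot exceed how many times per second
    # the full set of weights can be read from memory.
    return bandwidth_gb_s / model_size_gb

# Tinybox at 8,064 GB/s vs the rumored 512 GB/s for Digits,
# for a hypothetical 70 GB model:
print(max_tokens_per_sec(8064, 70))  # ~115 tok/s ceiling
print(max_tokens_per_sec(512, 70))   # ~7 tok/s ceiling
```

So even if the compute numbers were identical, a ~16x bandwidth gap translates directly into a ~16x gap in single-stream generation speed for large models.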
I'm still having a hard time comparing US salaries to those on other continents. I used to halve SF salaries to arrive at a purchasing-power-parity EUR equivalent. On top of that, median salaries are a poor summary statistic, due to the "Trimodal Nature of Software Engineering Salaries in the Netherlands and Europe" [0]
> The "sufficient explanation of the goals/problem" is the code—anything less is totally insufficient.
Somewhat in that spirit, I like Gerald Sussman's framing of software development as "problem solving by debugging almost-right plans", e.g. in https://www.youtube.com/watch?v=2MYzvQ1v8Ww
> First, we want to establish the idea that a computer language is not just a way of getting a computer to perform operations, but rather that it is a novel formal medium for expressing ideas about methodology. Thus, programs must be written for people to read, and only incidentally for machines to execute.