First off, thanks for the comment—pretty interesting stuff. One thing continues to bother me though.
> My point is that machine arithmetic preceded machine data storage.
> The concept of algorithms dates back to at least the 9th century, and probably back to Euclid.
I don't think that's the issue at hand. Special-purpose computers implement algorithms and may use arithmetic—those are not the key differentiators from general purpose computation. Perhaps they were important first steps, but clearly what e.g. Turing/Church did was something else entirely. And I'm aware Leibniz was very much concerned with the problem earlier on, but where I get a bit fuzzy is whether general purpose computers really needed the more fleshed out results of e.g. Turing (or perhaps a predecessor) in order to really build a general purpose computer—and whether that was a major limiting factor even after we had access to the necessary memory, or if we knew conceptually that what we were shooting for all along was fully general purpose computation and we were really just waiting for the hardware to be possible.
See Wallace Eckert's work at Columbia University.[1] In the early 1930s, IBM came out with punched card equipment that could multiply. Eckert managed to kludge several IBM tabulating machines into a sort of programmable computer. "The control box had settings that told the different machines what they should be doing at a given time and these gave the broad mechanical programming of the problem. The punches on the cards were used to program the specific details." This was all electromechanical, with relays, gears, and clutches, not vacuum tubes. IBM eventually turned that mess into the IBM Card Programmed Calculator. Still not enough memory for programs. J. Presper Eckert (no relation) went on to design many more computing machines, including the ENIAC. Finally, no moving parts, but still mostly plugboard-wired programs.
J. Presper Eckert is credited with inventing delay line memory.[2] That was the first useful computer memory technology. Suddenly, everybody in number-crunching was building delay line computers - EDVAC, EDSAC, LEO, UNIVAC I...
All were stored program machines. As soon as there was memory, people started putting programs in it. That's when computing took off.
Interesting—though I'm not convinced that the IBM approach to generalizing computation would have led us to general purpose computers as we know them now. Sure, it was more general than what had come before, allowing some parameterization of behavior—but 'more general' and 'universal' may be worlds apart (I don't know enough of the details of IBM's machines to say how much, though...). Even an architecture technically capable of complete generality, but not realizing it well, may be worlds apart from an architecture built on a simple theoretical grounding of that complete generality.
Eckert, it seems, was inspired by von Neumann, whom he worked with on the ENIAC, and von Neumann was an early appreciator of Turing's work on universal computation (from 1936)—no surprise that his eponymous architecture mirrors a universal Turing machine not only in power but in form (i.e. "the concept of a computer able to store in its memory its program of activities and of modifying that program in the course of these activities"[1]).
In view of that, I'm inclined to give similar importance to Turing's theoretical work still. Maybe I'm overestimating the importance of the architectural insights it led to—but I wonder if, say, we even had the Lambda Calculus, but no Turing—would we have fully generally programmable computers as we do now (which are as general in practice as well as in theory, if that makes sense)? I'm not sure...
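To make the quoted idea concrete—a program stored in the same memory as its data, able to rewrite itself while running—here's a toy sketch. The instruction set and memory layout are entirely made up for illustration; they don't correspond to any real machine:

```python
# Toy stored-program machine (hypothetical instruction set, for illustration
# only). Memory holds (opcode, operand) cells; instructions and data share
# the same memory, so a program can overwrite its own instructions.

def run(memory):
    """Interpret memory cells starting at address 0; return final memory."""
    pc = 0
    while True:
        op, arg = memory[pc]
        if op == "HALT":
            return memory
        elif op == "INC":
            # Increment the operand of the data cell at address arg.
            o, v = memory[arg]
            memory[arg] = (o, v + 1)
        elif op == "STORE":
            # Overwrite an instruction cell: the self-modifying step.
            addr, cell = arg
            memory[addr] = cell
        pc += 1

# A program that rewrites its own second instruction before reaching it:
program = [
    ("STORE", (1, ("INC", 3))),  # cell 0: turn cell 1 into "INC 3"
    ("HALT", None),              # cell 1: gets overwritten at runtime
    ("HALT", None),              # cell 2: the actual halt
    ("DATA", 41),                # cell 3: a data cell
]
final = run(program)
# cell 3 ends up ("DATA", 42): the instruction written at runtime executed
```

A plugboard-wired machine has no analogue of the `STORE`-into-code step: its "program" is physically fixed before the run starts.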
Wallace Eckert was building working hardware at Columbia in the early 1930s, long before von Neumann was involved.
There was a performance penalty to doing only one thing at a time, the stored program way. Stored programs carry a lot of overhead - fetching instructions, decoding instructions, executing instructions that don't directly do arithmetic - none of which was present in the plugboard-wired machines. Those usually did multiple operations on each cycle. When clock rates were a few kilohertz, this mattered, a lot.
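The fetch/decode overhead can be sketched with a toy cycle count. This is a simplified model I've made up for illustration (one cycle each for fetch, decode, and execute), not the timing of any actual machine:

```python
# Toy comparison: cycles spent summing numbers on a stored-program
# accumulator machine vs. a hard-wired (plugboard-style) datapath.
# Cycle costs here are illustrative assumptions, not real hardware timings.

def stored_program_add(values):
    """Sum values by interpreting a stored program; count cycles."""
    memory = [("ADD", v) for v in values] + [("HALT", None)]
    acc = 0
    pc = 0
    cycles = 0
    while True:
        op, arg = memory[pc]  # fetch: 1 cycle
        pc += 1
        cycles += 1
        cycles += 1           # decode: 1 cycle
        if op == "ADD":
            acc += arg        # execute: 1 cycle of actual arithmetic
            cycles += 1
        elif op == "HALT":
            break
    return acc, cycles

def plugboard_add(values):
    """Hard-wired datapath: one cycle per operand, no fetch or decode."""
    acc = 0
    cycles = 0
    for v in values:
        acc += v
        cycles += 1
    return acc, cycles

# stored_program_add([1, 2, 3]) spends 11 cycles where plugboard_add
# spends 3 - roughly the 3x bookkeeping penalty described above.
```

At kilohertz clock rates, a constant-factor penalty like this is the difference between a job finishing overnight or not.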
One result was a split between "scientific" and "business" computers. Scientific computers were usually binary, funded by the military, and were mostly one-off machines in the early days. Business machines were decimal, had to be cost-effective, and were mass produced. The two sides finally came together with the IBM System/360. By that point, both were stored-program.
As for "an architecture technically capable of complete generality, but not doing it so well", that was the IBM 1401, a very successful machine with a very strange architecture. The Computer History Museum in Mountain View has two of them working. It was a true stored program computer, quite different from any of the scientific computers. It had a much lower cost and parts count, and ran most of America's mid-sized businesses for years.