I read through most of the paper we are all nominally talking about. As you say, he doesn't use normal numbers; he is talking about computable numbers. My apologies.
But a normal number has "random" digits in the sense that all finite sequences of digits (in a given base) occur with uniform frequency in the expansion. What other notion of random can one meaningfully give for the expansion of a number's digits, without getting into too much philosophy? (There is a quick empirical sketch of this frequency property below.)
From [1]: "We call a real number b-normal if, qualitatively speaking, its base-b digits are 'truly random'."
My expertise is in commutative algebra, so I'm outside of my comfort zone.
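Still, to make the frequency claim concrete, here is a quick Python sketch (mine, not from the paper) that checks it on Champernowne's constant 0.123456789101112..., which is provably normal in base 10; every k-digit block should appear with limiting frequency 10^-k.

    from collections import Counter

    # First ~140,000 digits of Champernowne's constant: just the decimal
    # expansions of 1, 2, 3, ... concatenated.
    digits = "".join(str(n) for n in range(1, 30000))

    k = 2  # block length to test
    counts = Counter(digits[i:i + k] for i in range(len(digits) - k + 1))
    total = sum(counts.values())

    # For a base-10 normal number, each of the 100 two-digit blocks should
    # show up with frequency close to 10**-2 = 0.01.
    for block in ["00", "17", "42", "99"]:
        print(block, counts[block] / total)

The frequencies hover around 0.01 and tighten as you take more digits, which is exactly the uniformity I mean.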
Something can be random and not have a uniform distribution. If I throw two dice, the sum will be random, but it will not be uniformly distributed. Uniform is just one distribution among many.
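A quick simulation (my own sketch, in Python) makes the point:

    import random
    from collections import Counter

    # Tally the sums of 100,000 throws of two fair dice.
    rolls = Counter(random.randint(1, 6) + random.randint(1, 6)
                    for _ in range(100_000))

    for s in range(2, 13):
        # The empirical frequency peaks at 7 (about 1/6) and falls off
        # to about 1/36 at 2 and 12: random, but nowhere near uniform.
        print(s, rolls[s] / 100_000)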
The only definition of "randomness" in a sequence that holds any kind of water, philosophically or mathematically, is Kolmogorov complexity [0] (see specifically the section on "Kolmogorov randomness"). I don't know where you got the idea that "normalness" is the ultimate version of randomness, but it's not.
Forget the formal definition for a second and just think of the intuitive notion of randomness: does the sequence 12345678... look random to you? In a random sequence there should be no patterns: do you see a pattern in this one? If you had a computer program with a random number generator that produced that sequence of digits, would you be happy with it? No, you wouldn't.
In the sense of Kolmogorov randomness, the sequence 12345678... is obviously not random at all, since it's trivial to find a Turing machine that generates it. It matches up perfectly with our intuitive notion of what randomness is, and it quite correctly points out that the amount of information in the string is very low, even though it's infinitely long. That is my definition of randomness, and it's (more or less) the definition the author of this paper uses.
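To illustrate (my own sketch, reading "12345678..." as the digits of the successive integers written out in decimal):

    # The whole infinite sequence 123456789101112... is produced by this
    # tiny program, so its Kolmogorov complexity is bounded by the length
    # of the program (a few dozen bytes), however many digits you take.
    def digits():
        n = 1
        while True:
            yield from str(n)  # emit the decimal digits of n, then move on
            n += 1

    gen = digits()
    print("".join(next(gen) for _ in range(40)))  # 1234567891011121314...

Note that this is exactly the digit stream of Champernowne's constant, which is provably normal in base 10, so passing the uniform-frequency test says nothing about incompressibility.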
[1]: https://www.davidhbailey.com/dhbpapers/bcnormal.pdf