It all goes over my head, but what does the distribution of values look like? E.g. for unsigned integers it's completely flat; for floating point there are far too many zeros, and most of the numbers cluster around 0. What do these systems end up looking like?
Let me go ahead and compute that for all halting lambda terms of length at most 33 bits. The output I got from a modified BB.lhs (listing each normal form size alongside the number of terms attaining it) is:
Another comment asked for the smallest unrepresented number with 64-bit programs.
While I cannot give a definitive answer there, and one may never be found, here we see that the first unrepresented normal form size for programs of up to 33 bits is 83, a number of only 7 bits. Curiously, there are 7 unrepresented numbers even before the first uniquely represented number, 101.
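For anyone who wants to play with this without digging into BB.lhs itself, here is a rough Python sketch of the same idea (this is not the actual BB.lhs code): enumerate closed de Bruijn terms by their binary lambda calculus bit size, reduce each with a fuel bound (a crude stand-in for the real halting analysis), and tally the normal form sizes. I cap it at 20 bits rather than 33 so it finishes quickly; the fuel bound means slow-but-halting terms can be misclassified.

```python
from collections import Counter

# De Bruijn terms as tuples: ('var', i) with 1-based index,
# ('lam', body), ('app', fun, arg).

def size(t):
    # Binary lambda calculus sizes: Var i costs i+1 bits (1^i 0),
    # Lam adds 2 bits (00), App adds 2 bits (01).
    if t[0] == 'var':
        return t[1] + 1
    if t[0] == 'lam':
        return 2 + size(t[1])
    return 2 + size(t[1]) + size(t[2])

def terms(n, depth):
    # All terms of exactly n bits whose free indices are at most `depth`.
    for i in range(1, depth + 1):
        if i + 1 == n:
            yield ('var', i)
    if n > 2:
        for body in terms(n - 2, depth + 1):
            yield ('lam', body)
    for k in range(2, n - 3):          # smallest possible term is 2 bits
        for f in terms(k, depth):
            for a in terms(n - 2 - k, depth):
                yield ('app', f, a)

def shift(t, d, cutoff=1):
    # Add d to every free index >= cutoff.
    if t[0] == 'var':
        return ('var', t[1] + d) if t[1] >= cutoff else t
    if t[0] == 'lam':
        return ('lam', shift(t[1], d, cutoff + 1))
    return ('app', shift(t[1], d, cutoff), shift(t[2], d, cutoff))

def subst(t, j, s):
    # Replace index j by s, decrementing the free indices above it.
    if t[0] == 'var':
        if t[1] == j:
            return s
        return ('var', t[1] - 1) if t[1] > j else t
    if t[0] == 'lam':
        return ('lam', subst(t[1], j + 1, shift(s, 1)))
    return ('app', subst(t[1], j, s), subst(t[2], j, s))

def step(t):
    # One leftmost-outermost (normal-order) beta step; None if in normal form.
    if t[0] == 'app':
        if t[1][0] == 'lam':
            return subst(t[1][1], 1, t[2])
        f = step(t[1])
        if f is not None:
            return ('app', f, t[2])
        a = step(t[2])
        return ('app', t[1], a) if a is not None else None
    if t[0] == 'lam':
        b = step(t[1])
        return ('lam', b) if b is not None else None
    return None

def nf(t, fuel=100):
    # Normal form within `fuel` steps, else None -- a crude, fuel-bounded
    # substitute for deciding which terms halt.
    for _ in range(fuel):
        t2 = step(t)
        if t2 is None:
            return t
        t = t2
    return t if step(t) is None else None

MAX_BITS = 20   # far below 33, so the sketch runs in seconds
hist = Counter()
for n in range(4, MAX_BITS + 1):
    for t in terms(n, 0):
        r = nf(t)
        if r is not None:
            hist[size(r)] += 1

for s in sorted(hist):
    print(s, hist[s])
```

Pushing MAX_BITS toward 33 with this naive enumerator and no sharing would be painfully slow; the real BB.lhs is far more careful about that.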