I have to admit, DOS memory management is fascinating to me as a (very) amateur kernel investigator. I have a book called “DOS Beyond 640K” which describes all sorts of extensions people invented back in the 80s to get as much free memory as possible. The contents are of course irrelevant nowadays, but it is still an interesting tech book to read.
Good times. Our DOS game PaybackTime 2 could only use conventional memory. That was a major reason the game never had any proper animations for its player characters.
I set up a computer for an engineering department. It was an IBM PS/2. They wanted to run AutoCAD and Ventura Publisher, one used extended memory and the other expanded.
I ended up making batch files that swapped around autoexec.bat and config.sys files so they could run.
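For illustration, that swap trick usually looked something like this; a minimal sketch, assuming hypothetical file names and paths (the real ones would differ per machine):

```bat
@ECHO OFF
REM ACAD.BAT -- hypothetical sketch: install the extended-memory (XMS)
REM configuration AutoCAD needs, then prompt for a reboot.
REM CONFIG.XMS would load HIMEM.SYS; a matching VENTURA.BAT would copy
REM CONFIG.EMS, which loads an expanded-memory (EMS) manager instead.
COPY C:\CONFIGS\CONFIG.XMS C:\CONFIG.SYS > NUL
COPY C:\CONFIGS\AUTO.XMS C:\AUTOEXEC.BAT > NUL
ECHO Extended-memory configuration installed.
ECHO Reboot (Ctrl+Alt+Del) to load it, then run AutoCAD.
```

The reboot was unavoidable: CONFIG.SYS is only read at boot, so each application effectively got its own boot profile.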
That's talking about the MZ signature at the start of every DOS EXE executable (and therefore every Windows EXE, as they have DOS stubs), not this additional use as markers in the DOS memory management code. The latter is probably also Mark Zbikowski using his initials, but that doesn't seem to be confirmed.
Mmmm, flashbacks of complex sets of AUTOEXEC.BAT & CONFIG.SYS files that we'd swap in and out using batch files to support different memory configurations...
I'd done so well optimizing my conventional memory with my rig (a 486SX w/ 4, then later 16 MB of RAM), then I purchased a Media Vision Pro AudioSpectrum 16 card and screwed it all up.
The silly thing claimed Sound Blaster compatibility but needed a TSR that, if memory serves, couldn't be loaded into upper memory for that "compatibility" to actually work. It was maddening, but I'd already spent the money. Then there was the matter of throwing away more memory on the driver for the card's onboard SCSI controller... Grr...
MEMMAKER. It was okay, but it was so invasive in modifying your CONFIG.SYS and AUTOEXEC.BAT that I never really trusted it. I preferred hand-optimizing.
But then, it was my job, it wasn't for gaming or anything. I don't play games much and I had an Acorn Archimedes at home.
I could usually get 620 kB free by hand with no problem, even with a mouse driver, a CD-ROM driver, and a network stack loaded.
That was enough for 99% of business apps.
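For anyone who never had the pleasure, that kind of hand-tuning looked roughly like this. HIMEM.SYS, EMM386.EXE, DOS=HIGH,UMB, DEVICEHIGH, and LH (LOADHIGH) are the real MS-DOS 5/6 mechanisms; the driver names and paths here are illustrative:

```bat
REM --- CONFIG.SYS (illustrative hand-optimized layout) ---
DEVICE=C:\DOS\HIMEM.SYS
REM NOEMS frees the EMS page frame for upper memory blocks;
REM use the RAM switch instead if an app actually needs EMS.
DEVICE=C:\DOS\EMM386.EXE NOEMS
REM Load the DOS kernel into the HMA and enable UMBs.
DOS=HIGH,UMB
REM Load device drivers into upper memory blocks where they fit.
DEVICEHIGH=C:\DRIVERS\CDROM.SYS /D:MSCD001

REM --- AUTOEXEC.BAT (load TSRs high) ---
LH C:\DOS\MSCDEX.EXE /D:MSCD001
LH C:\MOUSE\MOUSE.COM
```

The art was in the ordering: drivers had to be loaded largest-first into the available UMB regions, which is exactly the packing problem MEMMAKER tried (invasively) to automate.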
Being able to get ACT! for DOS running alongside a Novell Netware client on Sony laptops won me a senior job in the City of London in about 1992. (I didn't like it and quit a few years later, after a major motorbike crash made me re-assess life priorities.)
In that job I rolled out 10BASE-T and desktop Windows for Workgroups 3.11. That specific version, WfWg 3.11 (and not WfWg 3.1 or Windows 3.11, which were both different products), contained the first version of what became VFAT, which paved the way for FAT32 and Long File Names on FAT. It was a prototype of the 32-bit driver subsystem that enabled Windows 95.
And Win95 not only made the Win3 GUI irrelevant, it made DOS memory optimisation irrelevant too.
In the same City job, I also rolled out Windows NT 3.1 in production. Of course, a decade later, that rendered Windows 9x irrelevant.
I am not a gamer and never really was, but a default config of Win95 made a lot of RAM available for DOS apps, as I recall. (And I was a serious expert in this area, 30-35 years ago.)
I used to do very basic memory optimisation on my Win95 boxes, just because I could with minimal mental effort, and then my DOS sessions had 630 kB or so free.
What I confess I did not investigate was DOSSTART.BAT and optimising what RAM was left when Windows was in "DOS mode".
I had a 286 with 1 MB of RAM. It had a Chips and Technologies chipset that used that RAM to shadow the ROM BIOS, but you could also have it map memory into any free region of the upper memory area. So I too religiously optimized conventional/upper memory, because that was all the memory I had.
We made special boot disks that stripped out everything but what the game needed, but sometimes we still couldn't make it fit.
One day I went to a friend's house and he had way more conventional memory showing in memtest! What the hell? I'd spent hours and days getting to 620 kB.