Are you going to write your own programming language as well? Can we call it Tolkien? Because you're making a game the way J.R.R. Tolkien wrote books, and there's a reason nobody writes books the way he wrote his.
Writing your own engine is great if you want to learn how to write a game engine. Knowing how to make a game engine can be helpful when making a game, but it's not necessary to make a game. Further, if you want to learn how to make a game, it might be more worth your time to simply use an engine that already does all the things you need. That way your time and energy can be focused on making the game, which is what your goal is.
Being condescending or dismissive of tools that already do everything the tools you're going out of your way to construct will have to do is... weird logic. Because the same argument goes all the way down. Why wouldn't you make your own text editor? Why wouldn't you make your own compiler? Why wouldn't you make your own kernel? Why wouldn't you make your own architecture? "If you wish to make a pie from scratch, you must first invent the universe."
The answer is: because we're human beings with limited lifespans. We must stand on the shoulders of giants to see further.
One person wrote books like JRR Tolkien. His name was JRR Tolkien, and those books are widely celebrated by millions of people as classics.
I don’t have any issue with people using an engine like Godot or Unity or RPG Maker or Unreal or anything else, but I do think that there can be value in “owning the entire stack” of a project, even if that means “reinventing the wheel”.
When I do a project involving HTTP, I could reach for Rails or something; that's a valid enough choice and I've certainly done it plenty of times, but I often work with a lower-level protocol instead. Depending on the language, sometimes I'll use a simpler HTTP server library like Axum with Rust, and other times I'll go full epoll/Selector with a raw socket.
I do this for a variety of reasons, but the main one is that I can build my own framework that works the way I think, and I don't pull in a bunch of extra crap I don't need. I can optimize the "hot paths" of my particular project without the one-size-fits-all compromises you get from generic frameworks, I don't have to worry as much about leaky abstractions, and I am intimately familiar with a much larger percentage of the codebase.
Tolkien was exceptional and dedicated his entire life to it. 99.99+% of all people do not possess such a combination of talent and focus and therefore end up having to use “shortcuts”.
> and there's a reason nobody writes books the way he wrote his
And there's a reason nobody has come even close to his grandeur.
> Being condescending or dismissive of tools that already do everything the tools you're going out of your way to construct will have to do is... weird logic.
They've merely pointed out that there's nothing wrong with reinventing tools, you're the one attacking them.
It sounds like you don't like programming. I am in the process of writing my own language/IDE/compiler on the side of making games, and have already written a dialect of C# with a compiler that transpiles it to legal C# for use in the meantime. I would, in fact, love to write my own OS if not for the fact that proprietary hardware vendors make it virtually impossible for anybody to create a new OS that runs on consumer hardware in the year 2026. If you gave me a trillion dollars with which to build a CPU factory, I'd jump at the chance to learn that too.
People who don't like programming, who wish to abstract it all away and "stand on the shoulders of giants"[1] without understanding anything about the giants, seem to view low-level code as a bogeyman. It doesn't take a lifetime to understand. To the contrary, I would argue that low-level code is easier to work with than working only with high-level code, because you can reason about it. The more you rely on abstractions you don't understand, the more impossible it becomes to effectively reason about anything, because your reasoning is glossing over the details that make things work. But reasoning about primitives, and the things built out of those primitives that you understand, is not actually nearly as hard as the people who just want to plop Javascript libraries together and stop thinking about it would believe.
In particular, when it comes to games, especially 2D games (which are what Godot and MonoGame are typically used for), it's really not that hard. Windows has an API for doing X, Y, and Z with graphics. Linux has an API for doing X, Y, and Z with graphics. You write a wrapper that your game code calls, which passes through calls to each of those APIs with an #if statement filtering for which OS you're running on. Rinse and repeat for the other set of platforms, with a bit of extra finagling for API limitations on web and phone OSes. Rinse and repeat for audio, input, and font handling. It took less than a month of work for me to get a polished cross-platform system working on five platforms. Not because I'm a genius, but because it's seriously just not hard. There are a thousand tutorials and books you could pick from that will give you a rundown of exactly how to do it.
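The pass-through wrapper idea can be sketched in a few lines. This is a Python analogue with hypothetical stub backends standing in for the real Win32 and X11/Wayland calls (all names here are illustrative, not any real engine's API):

```python
import sys

# Hypothetical backends: real ones would wrap the native Win32 and
# X11/Wayland window-creation APIs; these stubs just record the call.
def _win32_create_window(title, width, height):
    return ("win32", title, width, height)

def _x11_create_window(title, width, height):
    return ("x11", title, width, height)

def create_window(title, width, height, platform=sys.platform):
    # Game code calls this one function; the branch below is the
    # Python analogue of the #if platform filter described above.
    if platform.startswith("win"):
        return _win32_create_window(title, width, height)
    return _x11_create_window(title, width, height)
```

Game code only ever sees `create_window`; adding a platform means adding one backend and one branch, not touching the call sites.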
Then, for example, writing your own rudimentary 2D GUI map editor can literally be done in a day. Presumably you know how to code a main menu. Add an option to the main menu that changes the gamestate to State.MapEditor when selected. Set a keybind on this state where your arrow keys increment or decrement X/Y coordinates, a keybind to place tiles/objects, a keybind to cycle which object ID is selected, and a keybind that calls a function which serializes your map state to text and saves it to a file. A little bit more work for a moving camera viewport, but it's not that hard. Want more features, polish it more. When you fully understand the primitives your system is built with, adding new features can be done quickly and easily, because it's so easy to reason about compared to reasoning about code you've never read built with primitives you don't understand.
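The serialize-to-text step described above can be as simple as one line per placed tile. A sketch in Python (the format is made up for illustration):

```python
def serialize_map(tiles):
    # tiles: {(x, y): object_id}; one "x y id" line per placed tile,
    # sorted so the saved file is stable under version control.
    return "\n".join(f"{x} {y} {t}" for (x, y), t in sorted(tiles.items()))

def deserialize_map(text):
    # Inverse of serialize_map: parse each "x y id" line back into
    # the coordinate-to-object-id dictionary.
    tiles = {}
    for line in text.splitlines():
        x, y, t = (int(v) for v in line.split())
        tiles[(x, y)] = t
    return tiles
```

When you own both ends of the format, "add a feature" is one more field per line, not a fight with someone else's editor plugin.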
3D does up the difficulty level, but it's by no means unachievable, either. The content creator Tsoding is currently doing a semi-weekly challenge to build his own 3D game engine from scratch on video, and he's making great progress despite not spending that much time on it, a side project that gets a few hours a week.
The end result of all this is a codebase that is more performant, lightweight, easy to read, and very easy to extend. I think developing your own engine can actually save time in the long run (if you're willing to forego the instant gratification), because it's so easy to fix bugs and add new features when you have a complete mental map of your codebase and the primitives used to construct it. For example, I have a friend who used Godot to develop a game, and they've been plagued for months with a low percentage chance of fatal crashes on a boss that they are completely unable to identify and fix, and it's because they don't have a mental map of the engine code. It's simply not even possible for them to reason about what in the engine could be going wrong because they don't even know what the engine is actually doing.
[1] Another metaphor that is grossly mis-invoked, in my view. Do you think Isaac Newton did not understand the work of those who came before him? The great thing about giants is that by doing the hard work of exploring new concepts, they make it easier for everyone who comes after them to learn them. I think it's a bit intellectually lazy to write off the work of giants as something that should not, or even cannot, be learned.
[2] "like J.R.R. Tolkien wrote books, and there's a reason nobody writes books the way he wrote his." It's a real shame more people don't, considering there has never been a fantasy work rivalling his in the near-century since.
It sounds like you're talking about making an equivalent of Super Mario from the 80s, but modern games are in fact much more complex.
And no, just because people in the 80s enjoyed Super Mario doesn't mean it's the pinnacle of game design, and that there's no need to create anything more complex.
> It took less than a month of work for me to get a polished cross-platform system working on five platforms.
You simply don't know where the bugs and performance pitfalls are because you haven't encountered them, yet. That is especially true regarding consoles with their custom hardware and mobile devices with their abundance of cheap, often not well engineered hardware and sketchy drivers.
"Modern games" span a wide range of things. I develop solely 2D games, because I prefer 2D games over 3D games. I think that even today 2D games are more enjoyable than 3D games. That doesn't mean Super Mario Bros. That can mean Europa Universalis IV, it can mean Stardew Valley, it can mean Magic the Gathering Online, it can mean Hollow Knight, it can mean Slay the Spire, it can mean a huge variety of interesting and engaging games, none of which require 3D graphics. 2D games can be as complex as you'd like them to be, far more complex in game logic than a 3D shooter even. The more complex you'd like them to be, the easier it gets to implement them if you understand the primitives you're implementing them with. Imagine trying to optimize your data structures when you don't even know what an int32 is? There are real game developers in the world who don't know even that much. It is a great thing that off-the-shelf game engines provide a level of accessibility to allow anyone to develop games, but they do not represent the pinnacle of what can be achieved in software engineering. They are the exact opposite of it, in fact.
> You simply don't know where the bugs and performance pitfalls are because you haven't encountered them, yet.
What is your point? I profile my games and have detailed logging systems. If I or my users run into performance issues, I address them as I come across them. Understanding my codebase at a low level makes it significantly easier to dig into problems and investigate underlying root causes than anyone on Unity will ever be able to. If you use Unity, you are putting your complete faith in Unity having perfectly optimized X low-level problem away at the engine level. If they haven't, and you run into that issue in your game, you are completely fucked. I love being solely responsible for the defects in my games. That means I can fix them myself. The worst thing in the world in software development is when somebody else's fuck-up becomes your problem, and you can't fix it, so you have to implement some hacky workaround, if you can even figure out why the closed-source engine code you didn't write and can't read is behaving incorrectly in the first place. Sometimes that still happens anyway -- our hardware-OS stacks are built with tens or hundreds of millions of lines of dogshit code, and you can't get around it if you want to create software for platforms people use, but you can at least remove as many dependencies on bad code you have no understanding of as possible.
You're already too late at that point, and you've probably lost some players who wanted to try your game and maybe would've even liked it.
And I'm not talking about gameplay logic bugs - I'm talking about issues caused by bad drivers or by not having intimate knowledge about the hardware.
> If you use Unity, you are putting your complete faith that Unity has perfectly optimized X low-level problem away at the engine level
Most major engines let you bypass high-level abstractions, either through scripts that access low-level systems (Unity) or by letting people modify the source code directly (Unreal Engine, Godot).
> I love being solely responsible for the defects in my games.
> by directly letting people modify the source code (Unreal Engine, Godot).
Unreal is not open source, and while Godot is, I would wager 90% of its users never even look at the source code. It very specifically attracts people who want an easy way to make games without prior expertise.
> Players do not care about that.
Users don't care about much when it comes to software quality, honestly. They accept 20 FPS, slow loading, bug-riddled games that consume 20+ GB of RAM and 100+ GB more disk space than necessary. They may complain about a game if it gets bad enough, but they still buy and play those games. My games are significantly more optimized than most. They aren't perfect, but they don't need to be. They don't even need to be as optimized as I have made them; it's mostly just a point of pride and making the kind of software I want to see in the world. I think the only way you lose a player on technical points is if they literally cannot boot your game, but those issues plague engine games too. I had driver issues myself crashing on boot with a UE5 game two weeks ago.
There's also a hot-spot problem with databases: the performance problem with autoincrement integers. If you're always writing to the same page on disk, then every write has to lock the same page.
UUIDv7 is a trade-off between a messy b-tree (page splits) and a write-page hot spot (latch contention). Inserts still land on the right side of the b-tree, but they're spread out more to avoid hot spots.
That still doesn't mean you should always use v7. It does reversibly encode a timestamp, which could be used to determine the rate at which ids are generated (analogous to the German tank problem). If the UUIDv7 is monotonic, it's worse in this respect.
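For reference, the v7 layout is a 48-bit millisecond timestamp in the high bits, then the version and variant fields, then 74 random bits. A sketch of constructing one by hand (per RFC 9562; the stdlib gained `uuid.uuid7` in Python 3.14, this is just to show the bit layout):

```python
import os
import time
import uuid

def uuid7_sketch():
    # 48-bit unix timestamp (ms) in the high bits: new ids land near
    # the right edge of the b-tree; 74 random bits spread the writes.
    ms = time.time_ns() // 1_000_000
    rand_a = int.from_bytes(os.urandom(2), "big") & 0x0FFF           # 12 bits
    rand_b = int.from_bytes(os.urandom(8), "big") & ((1 << 62) - 1)  # 62 bits
    value = (ms & 0xFFFFFFFFFFFF) << 80   # timestamp, bits 127..80
    value |= 0x7 << 76                    # version 7, bits 79..76
    value |= rand_a << 64                 # bits 75..64
    value |= 0b10 << 62                   # RFC variant, bits 63..62
    value |= rand_b                       # bits 61..0
    return uuid.UUID(int=value)
```

Sorting these as integers or strings sorts them (approximately) by creation time, which is exactly the b-tree-friendly and information-leaking property discussed above.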
It's the same reason we use UTF-8. It's well supported. UUIDs are well supported by most languages and storage systems. You don't have to worry about endianness or serialization. It's not a thing you have to think about. It's already been solved and optimized.
Now generate your random ID. Did you use a CSPRNG, or were your devs lazy and just used a PRNG? Are you doing that every time you're generating one of these IDs in any system that might need to communicate with your API? Or maybe they just generated one random number, and now they're adding 1 every time.
Now transfer it over a wire. Are you sure the way you're serializing it is how the remote system will deserialize it? Maybe you should use a string representation, since character transmission is a solved problem with UTF-8. OK, so who decides what that canonical representation is? How do we make it recognizable as an ID without looking like something that people should do arithmetic with?
None of these are rocket-science problems, they're just standardization issues. You build a library with your generate_id/serialize_id/deserialize_id functions that work with a wrapper type, and tell your devs to use that library. UUID libraries are exactly that, except backed by an RFC.
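That in-house wrapper library is only a few lines once you commit to a CSPRNG and one fixed textual form; it's essentially a hand-rolled subset of what a UUID library gives you. A sketch, with illustrative names (not any real library's API):

```python
import secrets

def generate_id():
    # 16 bytes from the OS CSPRNG -- never a seeded, guessable PRNG.
    return secrets.token_bytes(16)

def serialize_id(raw):
    # One canonical textual form: 32 lowercase hex characters.
    return raw.hex()

def deserialize_id(text):
    # Reject anything that isn't exactly the canonical form.
    raw = bytes.fromhex(text)
    if len(raw) != 16:
        raise ValueError("id must be exactly 16 bytes")
    return raw
```

Every decision here (CSPRNG, hex, length check) is one the UUID RFC already made for you, which is the point.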
Of course they're not rocket science. But the question here is, "Why don't you use 16 random bytes instead of a UUIDv4?" It's not a question about rocket science. The answer is still, "Because UUIDv4 is a better way to do it." The UUID standard solves the second- and third-tier problems and knock-on effects you don't think about until you've run a system for a while, or until you start adding multiple information systems that need to interact with the same data.
But using UUIDv4 shouldn't be rocket science, either. UUID support should be built into any language intended for web, database, or business applications. That's why you're using Go or C# instead of C. And Go is somewhat focused on micro-service architectures; it's going to need to serialize and deserialize objects regularly.
> Now generate your random ID. Did you use a CSPRNG, or were your devs lazy and just used a PRNG?
There's nothing about UUIDs that requires them to be cryptographically secure. Many programming language libraries aren't (and some explicitly recommend against using them if you need cryptographically strong randomness).
Not for security, but to make sure you don't accidentally reuse the same seed. I've done that before when the PRNG seed was the time the application started, and it turns out you can run multiple instances at the same time.
128 random bits in some random format aren't a uuid. 0.2ml of water isn't a raindrop. If I say "you can provide me with a uuid" and you give me a base64-encoded string, it's getting rejected by validation. If I say "this text needs to be a Unicode string" and you give me a base64-encoded Unicode string's byte array, it's not going to go well.
Why are you implying that converting from base64 to and from standard UUID representation (hyphen-delimited hexadecimal) is more than a trivial operation? Either client or server can do this at any point.
Does Postgres not truly support UUID because it internally represents it as 128 bits instead of a huge number of encoded bytes in the standard representation? Of course not.
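The conversion in question really is trivial; the underlying 16 bytes don't change, only the textual coat does. In Python, for example:

```python
import base64
import uuid

u = uuid.uuid4()
# Same 16 bytes, base64 coat (24 chars with padding)...
b64 = base64.urlsafe_b64encode(u.bytes).decode()
# ...and back to the canonical hyphenated-hex representation.
back = uuid.UUID(bytes=base64.urlsafe_b64decode(b64))
```

Either side of the wire can do this in one line, which is the parent's point about representation being separable from identity.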
Ah, here we are. If it's just bytes, why store it as a string? Sixteen bytes is just a 128-bit integer, don't waste the space. So now the DB needs to know how to convert your string back to an integer. And back to a string when you ask for it.
"Well why not just keep it as an integer?"
Sure, in which base? With leading zeroes as padding?
But now you also need to handle this in JavaScript, where you have to know to deserialize it to a Bigint or Buffer (or Uint8Array).
UUIDs just mean you don't need to do any of this crap yourself. It's already there and it already works. Everything everywhere speaks the same UUIDs.
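That interoperability is concrete: the same UUID moves losslessly between string, 128-bit integer, and raw-byte forms in most stacks. In Python, for instance:

```python
import uuid

u = uuid.uuid4()
as_text = str(u)     # canonical hyphenated hex, for wire formats
as_int = u.int       # one 128-bit integer, how a DB might store it
as_bytes = u.bytes   # 16 raw bytes, e.g. Postgres's internal form

# Every representation round-trips back to the same UUID.
assert uuid.UUID(as_text) == uuid.UUID(int=as_int) == uuid.UUID(bytes=as_bytes) == u
```

The database, the JSON API, and the client library each pick whichever form suits them, and they all agree on the identity.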
You have to generate random bytes with sufficient entropy to avoid collisions and you have to have a consistent way to serialize it to a string. There's already a standard for this, it's called UUID.
It’s really not that complicated a problem. Don’t worry, you’ll certainly be able to solve all the problems yourself as you encounter them. What you end up with will be functionally equivalent to a proper UUID and will only have cost you man-months of pain, but then you will be able to truly understand the benefit of not spending your effort on easy problems that someone solved before you.
It's not a huge problem. UUID adds convenience over reinventing that wheel everywhere. And some of those reinvented wheels would use the wrong random source, hash, or encoding.
Yeah, I thought it was a strange comment, too. v7 is great when you explicitly need monotonicity, but encoded timestamps can expose information about your system. v4 is still very valid.
While the uuid package is actively maintained, it hasn't had a release since 2024. Indeed, there's an open issue from June 2025 asking about it: https://github.com/google/uuid/issues/194
I’m not sure of the state of that particular library, but yes, the RFC has changed significantly. For instance, the UUIDv7 format changed from the earlier draft RFC resulting in incompatibilities.
This is an example of an unmaintained UUID library in a similar situation that is currently causing incompatibilities because they implemented the draft spec. and didn’t update when the RFC changed:
Any Python developer using the uuid7 library is getting something that is incompatible with the UUIDv7 specification and other UUIDv7 implementations as a result. Developers who use the stdlib uuid package in Python 3.14+ and uuid7 as a fallback in older versions are getting different, incompatible behaviour depending upon which version of Python they are running.
This can manifest itself as a developer using UUIDv7 for its time-ordered property, deploying with Python <=3.13, upgrading to Python 3.14+ and discovering that all their data created with Python 3.13 sorts incorrectly when mixed with data created with Python 3.14+.
A UUID library that is not receiving updates is quite possibly badly broken and definitely warrants suspicion and closer inspection.
The problem is not that it is a draft RFC, the problem is that the library is unmaintained with an unresponsive developer who is squatting the uuid7 package name. It’s the top hit for Python developers who want to use UUIDv7 for Python 3.13 and below.
Your point is completely invalidated by useless name calling. The people behind cargo are clearly accomplished and serious individuals, and even if you disagree with some of the choices, calling them bozos makes your whole argument unconvincing.
RFC changes aside, the go community has been bit by unmaintained UUID libraries with security issues. Consider https://github.com/satori/go.uuid/issues/123 as a popular example.
The open issue in Google's repo about the package being malicious is not a good look. The community concluded it's a false positive. If the repo was maintained they'd confirm this and close the issue.
Maintenance is much more than RFC compliance, although the project hasn't met that bar either.
If the library just existed as a correct implementation of the RFC without bugs or significant missing features, that would be one thing. But leaving features and bug fixes already committed to the repository unreleased for years because the maintainer hasn't cut a new release since 2024 is a bad sign.
I would be more critical of Microsoft choosing to support UCS-2/UTF-16 if Microsoft hadn't completed their implementation of Unicode support in the 90s and then been pretty consistent with it.
Meanwhile Linux had a years long blowout in the early 2000s over switching to UTF-8 from Latin-1. And you can still encounter Linux programs that choke on UTF-8 text files or multi-byte characters 30 years later (`tr` being the one I can think of offhand). AFAIK, a shebang is still incompatible with a UTF-8 byte order mark. Yes, the UTF-8 BOM is both optional and unnecessary, but it's also explicitly allowed by the spec.
Human operators were not required of The Bell Telephone Company by law. Bell switched to mechanical switching stations as soon as doing so was economically advantageous.
(Reconsider my post. I'm arguing for no regulation.)
Feels more like you don’t understand the concept of the tragedy of the commons.
EDIT: Sorry, I’ve had a shitty day and that wasn’t a helpful comment at all. I should’ve said that as I understand it TOTC primarily relates to finite resources, so I don’t think it applies here. Sorry again for being a dick.