My current machine is an i5-3570k with a 1070Ti...
The old CPU is actually more of an issue. I couldn't run Civ 7 because the game (probably the DRM) uses some instructions that aren't implemented on that CPU. Other than that I bet it would run just fine.
I was just about to upgrade before hardware prices went through the roof. Now I'm just holding on until some semblance of sanity returns, hoping every day that the bubble pops and loads of gently loved hardware starts appearing on the secondary market. Also, the way nVidia has been skimping on memory for all but the most outrageously expensive chips has grated on me. I was really hoping they would buck the trend with the 5xxx generation, but nope, and with RAM prices the way they are I have little hope for the 6xxx generation. My current card is close to a decade old and has 8GB of VRAM. I'm not upgrading to a card with 8GB of VRAM, or even 12GB. That 8GB was crucial in future-proofing the original card; none of its 4GB contemporaries are of much use today.
They were probably forced to update when they dropped older busses. Without a PCI or AGP bus on there they have to find something that can hang off of a PCIe lane.
I remember having a ton of servers with cut down Mach64 chips. They were so bad that you would get horizontal lines flickering across the screen while text was scrolling in an 80x25 text console. I don't know why server manufacturers go to so much effort to make the console as terrible as possible. Are they nostalgic for the 8 bit ISA graphics from the original 5150? They seem offended at the idea that someone might hook a crash cart directly up to their precious hardware.
Matrox was really halfhearted with game support. They seemed far more interested in corporate customers, heavily advertising stuff like "VR" conference calls that nobody wanted. They were early with multi-monitor support back when monitors were big, heavy, and expensive. I had a G200, which was the last video card I've ever seen where you could expand the VRAM by slotting in a SODIMM. It also had composite out so you could hook it to a TV. I played a lot of games on it up until Return to Castle Wolfenstein, which was almost playable, but the low-res textures looked real bad and the framerate would drop precipitously at critical times, like when a bunch of Nazis rushed into the room and started shooting.
Last time I saw a Matrox chip it was on a server, and somehow they had cut it down even more than the one I had used over a decade earlier. As I recall it couldn't handle a framebuffer larger than 800x600, which was sometimes a problem when people wanted to install and configure Windows Server.
I mean it makes sense if you were just forced to implement an expensive waste management system and your competitor gets to just dump the stuff on the ground in a National Park. I would complain too.
It doesn't make sense if you were forced to implement waste management because you did it poorly to start with and your competitor found a smart way to do it for cheap.
No, that's more than napkin math, but I feel the numbers speak for themselves: we can't really do better than decades. A few km/s won't change that.
Those shitty modems were infamous. IIRC they were also the sound card on the box and had serious issues with interrupt conflicts. It took three wizards and a dead chicken to get Doom to run stably in an online deathmatch.
Historically, panels are generally considered exhausted after 30 years of service, although even that only means they're down to 80% of their original capacity.
The more failure prone component is the inverter, by a huge margin.
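To put that 30-year figure in perspective, here's a quick sketch of the implied degradation rate, assuming a constant compounding loss each year (the 0.80-after-30-years numbers come from the comment above; the constant-rate model is my own simplifying assumption):

```python
# Back-of-the-envelope solar panel degradation math.
# Assumption: capacity degrades at a constant compounding rate r,
# so after 30 years the panel retains (1 - r)**30 = 0.80 of its
# original output, per the figures quoted above.

years = 30
remaining = 0.80

# Solve (1 - r)**years = remaining for the implied annual rate.
annual_rate = 1 - remaining ** (1 / years)

print(f"implied annual degradation: {annual_rate:.2%}")  # roughly 0.74% per year
print(f"capacity after 30 years:    {(1 - annual_rate) ** years:.0%}")
```

So a panel "exhausted" at the 30-year mark is still losing well under 1% of output per year, which is why the inverter, not the panel, tends to be the part that actually dies.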
That is a problem. The only way it could be worse is if your technology required a constant supply of input from a foreign country...
From a geopolitical standpoint running a country on locally produced renewable power is obviously the least risky approach, even if you get cut off from further expansion of your renewable production.