Death of x86

Originally posted on June 13th, 2024 by Gethyn "Xylemon" ThomasQuail

You know, it's funny: I had been thinking about writing this for a while, and then Microsoft made a special announcement and I knew it was time. Yes, I think x86 is finally starting to kick the bucket, as it should. It has been for some time, but at an incredibly slow pace. With some events of the last decade, though, I think the industry is finally going to ramp up its efforts to move beyond x86, and honestly, good riddance.

What's the big deal?

A friend and I were talking the other day about how modern operating systems have all sorts of evil hacks in place to make sure old junk runs like it should. We are being held back by so much legacy cruft, not just in our operating systems but in our hardware. That's part of why modern machines are so big, expensive, and power-hungry. They could be built more efficiently; it's just that programs have long tried to take advantage of x86's cacophony of instructions, so you run into sharp edges rather quickly. Surely there must be some compelling reasons x86 has stuck around, right?

Why x86 chugged along

While it's easy to be cynical and say x86 has only been so dominant because the big players who produce the hardware want to milk it as long as they can, that's only slightly true in the grand scheme of things. Sure, Intel and AMD have a ton of awful patents, installed bases, and financial interest in x86, but ultimately with CPUs we have to consider:
This was the reality for a long time, from the '90s to even the early '10s. With tested and desired platforms, and the push for native apps instead of web apps, we had developers learning to write for hardware different from their own for the first time in ages.

But it's still an x86 world

Walk into any household in the world that has a desktop or laptop, and odds are it is an x86-based machine. Probably every "PC gamer" out there has x86 hardware. This is of course because nearly all software in that market is released for x86, and then there's compatibility with all the other old software. This has been Microsoft Windows' biggest selling point: older apps will always "work". It's probably part of why their original RISC effort, Windows on ARM, failed with the Surface tablets. Most people use Windows because old stuff loads and it's all they ever knew, but even with that, the cracks are starting to show.
More programs fail to run on Windows every year, with some titles now relying on DirectX-to-Vulkan layers even there, or players simply choosing to run them through Wine/Proton (UNIX's revenge indeed). RISC-V recently arrived, creating a powerful open base for other RISC CPU manufacturers to build on.

In a somewhat surprising move, Microsoft announced renewed interest not just in Windows on ARM but in tackling RISC-based laptops, and this time they'll ship a translation layer for x86 apps. Perhaps it will be a big failure: it won't run well, no one will use it, or gamers will hate it. I can't see the future, but if Microsoft is serious about this, then we might finally be heading into a RISC transition era.

Looking Forward

So, how will we run our old games and software in the future? One solution will be translation layers, whether they emulate operating system calls from Windows or the x86 instruction set itself. With patents finally expiring (and/or Microsoft willing to pay), we'll see x86 software running more or less at native speed, like we already see on ARM-based Macs. Worst case, we can always emulate selectively and still have an integrated experience that doesn't reek of the discomfort of the traditional "emulate a whole OS" approach. It worked for game consoles for decades. We'll be fine.

A more recent phenomenon has been complete reverse-engineering efforts. People rewrite code from scratch, or clean up decompiled binaries, to provide a truly portable experience with enhancements. (Editor's note: we do not condone breaking copyright law in your jurisdiction; we are merely reporting that it happens.) Once that is done, they're basically in the same boat as games like
DOOM and Quake, which were blessed with official open-source releases. These will probably be ported to everything, forever.

Conclusion

Will RISC-based CPUs be the architecture powering all devices a hundred years from now? It's hard to say, but at least we know that, thanks to smart emulators, translation layers, and open-source efforts, our favorite games will live long past us. And if you are using a RISC-based desktop and reading this in the far future, then you know when the stone age of computing finally started to end.

Ultimately, the biggest challenge now is not getting old software to run but "the last mile". This seems near impossible in the PC market, but someone would have thought the same in the early smartphone days, when BlackBerries and Windows CE ruled the scene. Now they're nowhere to be found in anyone's pockets. Don't worry: while the old hardware might go away, software lasts forever.

If you want to truly preserve the past, you need to embrace new technology as well. For example, there are now low-cost FPGA-based devices we can use to re-implement video game consoles, or obsolete, unreleased, or fantasy hardware. Imagine that one day you could stick a USB-drive-sized component into your machine and flash a core for any old hardware of your choosing: cycle-accurate, hopefully open-source, hardware replication for sound cards, graphics cards, and so on. This will be possible, since people have already re-implemented video game consoles of the 1990s with great results.

While we could still be working on Pentium processors with 3dfx Voodoo graphics, the fact is that those are no longer made. If you want to preserve these things for generations to come, help by re-implementing them in software with applications such as PCem or 86Box, or by re-implementing them on modern FPGA hardware. And no, you don't necessarily have to implement a whole system; the point is that you can create and apply a custom chip design to accelerate certain tasks without spending money on extra hardware. With FPGA technology coming down in price (see the Tang Nano 20K), we might very well see modular components in our computers taking advantage of this new-found freedom. That way, being able to run even a subset of specialized hardware instructions is not a problem. That's the theory, anyway. The sooner we switch, the less work we have in front of us.
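To make the translation-layer and emulation ideas above a little more concrete, here is a toy sketch of an instruction-level interpreter: the core loop at the heart of any emulator, stripped to its essence. It decodes a tiny subset of real x86 encodings (mov r32, imm32; register-to-register add; ret) and ignores flags, memory, and everything else. Real translation layers such as Apple's Rosetta 2 recompile whole blocks into native code rather than stepping one instruction at a time like this, so treat it as an illustration only.

```python
REGS = ["eax", "ecx", "edx", "ebx", "esp", "ebp", "esi", "edi"]

def run(code: bytes) -> dict:
    """Interpret a tiny subset of x86-32: B8+r (mov r32, imm32),
    01 /r with a register-direct ModRM byte (add r/m32, r32), C3 (ret)."""
    regs = {r: 0 for r in REGS}
    ip = 0
    while ip < len(code):
        op = code[ip]
        if 0xB8 <= op <= 0xBF:       # mov r32, imm32: register is encoded in the opcode
            regs[REGS[op - 0xB8]] = int.from_bytes(code[ip + 1:ip + 5], "little")
            ip += 5
        elif op == 0x01:             # add r/m32, r32 (register-direct form only)
            modrm = code[ip + 1]
            src, dst = REGS[(modrm >> 3) & 7], REGS[modrm & 7]
            regs[dst] = (regs[dst] + regs[src]) & 0xFFFFFFFF
            ip += 2
        elif op == 0xC3:             # ret: end of our toy program
            break
        else:
            raise NotImplementedError(f"opcode {op:#04x}")
    return regs

# mov eax, 2 ; mov ecx, 40 ; add eax, ecx ; ret
state = run(bytes([0xB8, 2, 0, 0, 0,
                   0xB9, 40, 0, 0, 0,
                   0x01, 0xC8,
                   0xC3]))
print(state["eax"])  # prints 42
```

Everything interesting about a production emulator (memory, flags, interrupts, timing) is missing here; the point is only that "run x86 on something else" bottoms out in a decode-and-dispatch loop like this one, which projects such as PCem and 86Box then extend to cycle accuracy.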