How SLI And CrossFire Devolved From Amazing Technologies To Dead In The Water In Just Two Decades


Taylor Bell


Published on Jul 14, 2024


Key Takeaways

  • From Voodoo2 to the RTX 4000 family, SLI and CrossFire have become relics of the past in today’s gaming landscape.
  • Gaming technology shifted from multi-GPU setups to dual GPU rigs due to space, compatibility, and performance issues.
  • With Nvidia ditching SLI for NVLink and AMD deprecating CrossFire, the era of multiple graphics cards is over in 2024.

With graphics cards being the most expensive component of any gaming rig, most PC builders tend to allocate over half of their budget to the GPU. But over a decade ago, it was impossible to imagine that someday we’d end up paying more than a grand for a single mid-range GPU. Besides the pricing, things were quite different back in the day, with many enthusiasts rocking SLI and CrossFire configurations in their systems.


Unfortunately, multi-GPU setups were unable to withstand the ravages of time. In fact, it’s impossible to use these technologies on current-gen consumer graphics cards, since both Nvidia and AMD have long washed their hands of SLI and CrossFire. But what led to such a change in the gaming landscape?

In the beginning, there was the 3dfx Voodoo2

’Twas a fantastic little device that began the multi-GPU crusade


An image of a 3dfx Voodoo2 GPU

Our story begins in the late 1990s, when 2D GUI acceleration had become mainstream in graphics chips, and the gaming industry was getting ready for GPUs that could combine 3D acceleration with 2D (and video) processing capabilities. When Nvidia’s RIVA 128 achieved that combination, other manufacturers began to follow suit.

However, 3dfx, a now-defunct GPU manufacturer, implemented an amazing feature in its Voodoo2 GPU that ended up influencing many PC builds in the following years. While its capability to render graphics at 800×600 resolution was fine and all, the real game-changing feature of this card was the Scan Line Interleave (SLI) technology.

Without going into specifics, this technology allowed two GPUs to share processing-related information with each other. Once you paired two of these cards with the help of a ribbon cable, both Voodoo2 units would render the horizontal scanlines of each frame in an alternating sequence, essentially doubling the graphics processing capability of the system. When Nvidia acquired 3dfx’s assets, Team Green also obtained the rights to the company’s products and IP, including SLI.
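The alternating-scanline scheme is simple enough to sketch in a few lines of code. This is purely illustrative (the function name and structure are my own, not anything from 3dfx’s actual drivers), but it shows how evenly the workload splits between the two cards:

```python
# Illustrative sketch of 3dfx-style Scan Line Interleave: two GPUs divide a
# frame by alternating horizontal scanlines. Names here are hypothetical,
# not 3dfx's real driver interface.

def assign_scanlines(frame_height: int, num_gpus: int = 2) -> dict[int, list[int]]:
    """Map each GPU to the scanlines it renders, in alternating order."""
    assignment: dict[int, list[int]] = {gpu: [] for gpu in range(num_gpus)}
    for line in range(frame_height):
        assignment[line % num_gpus].append(line)
    return assignment

# For a 600-line frame (the Voodoo2's 800x600 ceiling), GPU 0 takes the
# even scanlines and GPU 1 the odd ones, halving each chip's workload.
lines = assign_scanlines(600)
```

Because every frame is split line-by-line, neither card ever waits a whole frame for the other, which is part of why the original SLI scaled so gracefully.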

The 2000s marked the golden era of SLI

Dual GTX 690 in SLI

Source: Wikipedia

After taking over 3dfx, Nvidia came up with its own implementation of SLI, which the company termed Scalable Link Interface. Debuting with the Nvidia GeForce 6 series in 2004, the rebranded SLI featured more complex algorithms involving two distinct modes, split-frame rendering (SFR) and alternate-frame rendering (AFR), to utilize the processing power of two GPUs in tandem.

SFR was centered around dividing each frame horizontally and sending each section to a different GPU. Meanwhile, AFR delivered better performance by having the GPUs take turns rendering whole frames, though it was notorious for causing micro stutter in games. Nevertheless, SLI became quite popular among enthusiasts, and the technology thrived until around 2012. But more on that later, as it’s time to take a look at Team Red’s side of things.
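The difference between the two modes can be sketched as follows. Again, this is a conceptual illustration under my own naming, not Nvidia’s actual driver logic:

```python
# Conceptual contrast between Nvidia SLI's two modes (illustrative only):
# - SFR: each frame is split into horizontal bands, one band per GPU.
# - AFR: whole frames alternate between GPUs (GPU 0, GPU 1, GPU 0, ...).

def sfr_split(frame_height: int, num_gpus: int = 2) -> list[range]:
    """SFR: split one frame into contiguous horizontal bands, one per GPU."""
    band = frame_height // num_gpus
    return [range(i * band, frame_height if i == num_gpus - 1 else (i + 1) * band)
            for i in range(num_gpus)]

def afr_gpu_for_frame(frame_index: int, num_gpus: int = 2) -> int:
    """AFR: GPUs take turns rendering entire frames."""
    return frame_index % num_gpus
```

The sketch also hints at AFR’s weakness: since each GPU delivers every other frame independently, uneven completion times produce irregular frame pacing, which players perceive as micro stutter.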

AMD, too, began implementing CrossFire

Though there were a few hiccups in the beginning

PowerColor Hellhound Radeon RX 7900 GRE Radeon branding

Just a year after Nvidia stepped into the multi-GPU territory, ATI (a firm that was later acquired by AMD) decided to challenge its rival by releasing its own version of the technology. Dubbed CrossFire, ATI’s counterpart was a bit on the meh side, at least initially. Besides requiring specific motherboards compatible with the new technology, you had to buy a Master card (no, not the credit/debit card) that featured a special DVI connector, which you’d plug into another GPU called the slave.

However, the release of the Xpress 3200 chipset removed this master-slave GPU requirement, and by the time the Radeon HD 3000 series started making the rounds, AMD had leveled the playing field for multi-GPU setups. Alas, the glorious days of slotting many GPUs into PCs would soon come to a screeching halt.

Technological advancements led to a fall in the popularity of SLI/CrossFire

Why use many GPUs when one can do the trick?

After the release of the GTX 600 family, Team Green diverted its focus to improving the performance of single-GPU setups. Meanwhile, AMD’s GPUs began hitting astronomically high TDPs, meaning you needed a behemoth of a power supply to feed even one graphics card, let alone two or four of these power-hungry beasts.

Soon, the three and four-card configurations were replaced by dual GPU rigs. Even then, there were plenty of caveats you had to consider when slotting more than one graphics card into your PC.

Firstly, there was the issue of GPU size. By the mid-2010s, there was barely any space inside your average chassis to fit multiple graphics cards without taking a hit to thermals. As newer APIs like DirectX 12 and Vulkan started making the rounds, multi-GPU support shifted from the driver to the developer, making it even more difficult for studios to optimize their games for SLI/CrossFire setups. That’s before you include the immense driver work needed to prevent bugs and glitches in multi-GPU configurations.

To conclude, there was little reason to get a second graphics card when you could simply buy a single, more powerful one without suffering from performance and compatibility issues.

Nvidia slowly phased out SLI, while AMD quietly killed CrossFire

Palit GeForce RTX 4070 Super Dual GPU

In the end, only enthusiast-grade GPUs featured SLI, with the GTX 1070, 1070 Ti, 1080, 1080 Ti, and Titan Xp being the last consumer graphics cards to support the technology. With the release of the RTX 2000 series, Nvidia ditched SLI in favor of NVLink, which required expensive bridges to get two GPUs working in tandem.

By this point, the performance difference didn’t compensate for the glitches and bugs, assuming you even got better FPS in the first place. By the time Team Green revealed the Ampere lineup, there weren’t many gamers left who’d buy two GPUs for their needs. For the RTX 4000 family, Nvidia scrapped consumer multi-GPU support entirely.


AMD, on the other hand, was more subtle in its decision to pull the plug on CrossFire. Following the release of the RX Vega family, CrossFire support was deprecated on Team Red’s GPUs, and the feature was absent from the RX 5000 series, marking the end of the technology.

SLI and CrossFire: Relics of the past that serve no purpose in 2024

An image showing a Zotac Gaming RTX 4070 Super GPU kept next to Gigabyte RTX 4070 Ti Gaming OC GPU for size comparison.

Currently, both SLI and CrossFire have been buried in the tech graveyard. Given the current GPU market, it doesn’t look like Team Green and Team Red will be resurrecting these technologies anytime soon.

Meanwhile, the consumer tech industry is only focused on single-GPU setups. And that’s low-key a good thing, because I shudder at the thought of the sky-high electricity bills and hardware-melting heat that dual (or god forbid, multiple) RTX 4090s could generate.
