
I tried using a dual GPU setup with Lossless Scaling - here's how it went

Back in the golden days of computing, multi-GPU setups were more than just a flex. While stacking a couple of graphics cards inside your PC wouldn’t exactly result in a linear increase in performance, you’d get a noticeable boost in frame rates.

Unfortunately, the ever-growing power requirements, lack of support from developers, and disappointingly high thermals resulted in multi-GPU configurations biting the dust. Or so you’d think. Recently, the frame generation-cum-upscaling app Lossless Scaling added support for dual GPU setups, allowing you to pair practically any secondary graphics card, discrete or integrated, with your main rendering powerhouse. Having previously covered the app here on XDA, I knew I had to pull out my old graphics cards to try it out – and here’s a log of my tests.

Related
How SLI and CrossFire devolved from amazing technologies to dead in the water in just two decades

SLI and CrossFire were all the rage back in the day. But as single GPU setups became more prominent, these cool interfaces faded into oblivion

The Frankenstein testbench for the project

I promise it looks a lot less unhinged than it sounds

A PC with an RTX 3080 Ti and a GTX 1080

Before I discuss the tests, I’ll quickly go over the specs of the PC I used for this setup. At the core of the build sat my trusty Ryzen 5 5600X and 32GB of DDR4 memory, as that’s the most powerful configuration in my tinkering cave. GPU-wise, the RTX 3080 Ti served as the primary device, while the ol’ reliable GTX 1080 took on secondary duties. For the PSU, I went with a 1000W Corsair RM1000e to avoid starving the system.

Enabling Frame Generation in Lossless Scaling

Installing Lossless Scaling was as straightforward as ever, and soon, I was greeted with its familiar UI. However, the app had undergone major improvements and featured entirely new options since the last time I reviewed it. I conducted most of my tests with LSFG 3.0 (X2) as the frame generation option, while leaving the upscaling settings untouched. I also set the capture API to WGC, as it’s supposed to produce lower latency on the latest version of Windows 11. The GPU section gets a little tricky, as I had to plug separate cables from my RTX 3080 Ti and GTX 1080 into my 4K monitor. However, since I planned to offload the frame generation workloads to the latter, I set it as the Preferred GPU and made sure it appeared under the Output display section as well.
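If you want to sanity-check which adapters Windows can actually see before picking the Preferred GPU and Output display, a quick script along these lines helps. It’s my own ad-hoc check rather than anything Lossless Scaling provides, and it assumes you’re on Windows with PowerShell available; it lists every video adapter, discrete cards and iGPUs alike.

```python
import subprocess

# Ask Windows for every video adapter it knows about (discrete cards and
# iGPUs alike), so you know exactly what should appear in Lossless
# Scaling's GPU and Output display dropdowns.
cmd = [
    "powershell", "-NoProfile", "-Command",
    "Get-CimInstance Win32_VideoController | Select-Object -ExpandProperty Name",
]
adapters = subprocess.run(cmd, capture_output=True, text=True, check=True)
print(adapters.stdout.strip())
```

On my testbench, that list simply shows the RTX 3080 Ti and the GTX 1080, which makes it obvious which entry to set as the Preferred GPU.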

The dual GPU setup works surprisingly well

Though the cards focus on entirely different workloads

A GTX 1080 Founders Edition GPU

If you’re wondering about this primary and secondary GPU hullabaloo, it’s because Lossless Scaling doesn’t exactly split the in-game rendering tasks between the cards. Rather, the main graphics card (my RTX 3080 Ti, in this case) is entirely responsible for rendering everything in the games, while the secondary card (my GTX 1080) runs Lossless Scaling and handles the frame-generation aspect.


On paper, this should introduce more latency… and well, it sort of does if you’ve paired a really weak iGPU with your main card. But here’s the catch: running frame generation on your primary graphics card is already pretty taxing at higher resolutions, and things only get worse once that GPU is pinned at 100% utilization, because the frame-generation work then competes with the game for the same render time. Offloading it to a second card frees the primary GPU to focus purely on rendering.
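To make the extra workload concrete: in X2 mode, every second frame you see is synthesized rather than rendered by the game. The snippet below is a deliberately naive stand-in for that idea – a plain 50/50 blend of two captured frames using OpenCV – and not LSFG’s actual motion-aware algorithm; it just shows that producing the in-between frame is its own chunk of compute, which some GPU has to pay for.

```python
import cv2

# Naive illustration of X2 frame generation: synthesize one extra frame
# between two real ones by blending them. LSFG uses a far more sophisticated,
# motion-aware approach; this only demonstrates where the extra work comes from.
frame_a = cv2.imread("frame_0001.png")  # hypothetical captured game frames
frame_b = cv2.imread("frame_0002.png")

# The 50/50 blend stands in for the "generated" intermediate frame.
generated = cv2.addWeighted(frame_a, 0.5, frame_b, 0.5, 0)

cv2.imwrite("frame_between.png", generated)
```

In the dual GPU setup, that kind of work lands on the GTX 1080, while the RTX 3080 Ti spends its entire budget on the game itself.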

Running Cyberpunk 2077 with Lossless Scaling off

Since my RTX 3080 Ti delivers roughly 56 FPS on high settings in Cyberpunk 2077, I increased some lighting, shadow, and volumetric settings to ultra while leaving ray tracing disabled. Without any upscaling voodoo, the game’s benchmark utility ran at 43 FPS, and as you’d expect, the overall experience was rather choppy. Enabling Lossless Scaling somewhat improved the smoothness, though it also introduced some microstutter every time the frame rate dipped. But the real surprise was that running both Cyberpunk 2077 and Lossless Scaling on my RTX 3080 Ti dragged the base frame rate down to 31 FPS on average. Forcing Lossless Scaling onto the GTX 1080, however, was a game-changer: the benchmark held 42 FPS while also running a lot smoother than the non-frame-generated version.

The actual gameplay experience was just as impressive

It’s still frame generation, but Lossless Scaling has come a long way

Running Cyberpunk 2077 with Lossless Scaling enabled

When I first used Lossless Scaling last year, I loved the concept behind the app – and even its execution, for what it’s worth. However, my experience was marred by weird artifacts, random stutters, and a little too much blurring. Well, those issues are still present on the latest version of the app, but they’re a lot more manageable now, especially with LSFG 3.0 and dual GPU support.


Since benchmarks are, well, just benchmarks, I decided to play Cyberpunk 2077 for a couple of minutes in each of the different configurations. After spending quite a bit of time in Night City, I can confirm that the actual performance was no different from the benchmark runs. The only change was that I’d dialed the volumetric lighting settings down to high to ensure the frame rate stayed above 40 FPS, and I capped the FPS at 40 inside RivaTuner Statistics Server for a more stable experience.
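If you want to see how hard each card is working during a session like this, a small polling script is enough – this is my own sketch rather than anything bundled with Lossless Scaling, and it assumes both cards are NVIDIA GPUs with nvidia-smi available on the PATH.

```python
import subprocess
import time

# Poll nvidia-smi once a second and print each GPU's load and VRAM use.
# Handy for confirming that the frame-generation work really lands on the
# secondary card instead of the primary one.
QUERY = [
    "nvidia-smi",
    "--query-gpu=index,name,utilization.gpu,memory.used",
    "--format=csv,noheader,nounits",
]

try:
    while True:
        out = subprocess.run(QUERY, capture_output=True, text=True, check=True)
        for line in out.stdout.strip().splitlines():
            idx, name, util, mem = [field.strip() for field in line.split(",")]
            print(f"GPU {idx} ({name}): {util}% busy, {mem} MiB VRAM")
        print("---")
        time.sleep(1)
except KeyboardInterrupt:
    pass
```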

Running Baldur's Gate 3 on Uperfect UGame K118 portable monitor

Admittedly, there were a couple of microstutters – like when I went guns blazing in Night City and ended up sinking the FPS below 30. But the overall experience was actually pretty good. After that, I gave Baldur’s Gate 3 a shot and repeated most of the tests. Once again, the dual GPU combo reigned supreme, and the primary card’s utilization even dropped to 89% instead of staying consistently above 98%. Repeating the experiment on Red Dead Redemption 2 and The Witcher 3: Wild Hunt (with ray tracing enabled) netted similar results. With that, it’s time to assess the feasibility of this unorthodox setup.

So, should you use two GPUs with Lossless Scaling?

An Intel Arc A750 placed next to an ASUS ROG Strix RTX 3080 Ti and a GTX 1080 Ti

If you’re on a laptop with a solid discrete card and a secondary GPU at least as powerful as a GTX 1050 Ti (preferably a GTX 1060), you’re in for a good time. For argument’s sake, a gaming laptop with a high-end RTX 20/30 series GPU and a Radeon 780M iGPU will work exceedingly well. Similarly, if you’ve got a spare graphics card that can keep up with its newer sibling and a PSU that can power both cards, a dual GPU setup makes a lot of sense, especially if you’re planning to game at 4K 144 FPS without compromising on graphical settings.

Related
I tried gaming on a VM hosted on a Proxmox server – here’s how it went

Despite the ridiculous premise, the Proxmox-powered gaming machine worked extraordinarily well after some tweaks

But you shouldn’t forget that Lossless Scaling is all about frame generation (and upscaling, since that’s the name of the app), which is a makeshift method to make gameplay feel smoother. Plus, it can’t really beat Nvidia’s newer DLSS variants at providing the best balance between quality and performance. However, it’s still a feat that a third-party developer not only created a fully functional frame generation algorithm but managed to get it working with practically every PC title. While I’m not very fond of relying on fake frames, I’d be lying if I said I wasn’t amazed by Lossless Scaling’s potential. Considering the sheer improvements we’ve seen in just a year, I might just end up using it to enjoy slower-paced RPGs at high settings on my RTX 3080 Ti + GTX 1080 combo.

A render of the Lossless Scaling logo

Lossless Scaling

With its powerful frame generation and upscaling features, Lossless Scaling offers an affordable way to increase your in-game FPS.
