I love The Elder Scrolls IV: Oblivion. It just turned thirteen years old. In human years, that means it’s now out of elementary school, but in the gaming world it might as well be long dead, buried, reanimated as a zombie, and then killed again.
To my dismay, it’s never had a remaster or re-release on newer hardware. Skyrim gets a million ports to everyone’s toaster and refrigerator, while Oblivion is lost to the ether of time.
When it launched, Oblivion was demanding on the computers of its day, even after Bethesda scaled back its visual ambitions over three years of developing for a moving hardware target. Thirteen years later, though, this shouldn’t be a problem. Anything should be able to run Oblivion.
I own a Dell G5 from the middle of last year. It’s got a Max-Q GeForce GTX 1060 GPU in it that Dell decided to run at full desktop clocks, negating the energy savings of the smaller form factor in the process. The 1060 is a nice little beast when the laptop is plugged in, but on battery it has to limit framerates to 30 to eke out more than 90 minutes of gaming life.
However, there’s another option for when I’m working in a coffee shop away from my power brick and need to play a video game instead of grinding away at tasks: The Intel UHD 630.
Neither Intel’s fastest nor slowest hardware, it’s quietly tucked away inside my i7-8750H CPU, ready to provide a modicum of graphical power with more battery efficiency if I’m willing to stoop to its dumb level.
In modern games this is a pointless exercise. The 630 struggles to even load some of them, and once it’s running I’m lucky to get performance at anything above the lowest visual settings. At that point my mind starts to wander back to my work.
Older games, though, really shine on the chip. Oblivion may have been a bit of a beast thirteen years ago, but the UHD 630 laughs in the face of older titles. Bring it on. Right?
On paper, the UHD 630 has ample power to run the game. In the real world, it’s a total mess. If I turn on the HDR lighting option, the game will only output a blank gray screen. The mode is completely incompatible with my “modern” Intel graphics hardware.
Falling back on standard bloom lighting, everything looked good. At first glance. But then I noticed the shadows weren’t working, the textures had lost much of their shading, and the framerate would randomly jump between perfectly smooth and “hold on a second, I’ll be right back.”
Even the opening cutscene, a pre-rendered video, struggled to look right for a while. It began decoding into several smaller windows instead of a full screen picture, before righting itself after about 20 seconds of rampant glitchery.
Graphics cards aren’t just hardware. Their drivers are just as important as their silicon. Games don’t magically have direct access to all the features of your video card. It’s up to the graphics drivers to act as a gateway between the game and your system, offering up the visual goodness we’ve all come to rely on. That’s why driver updates often improve performance in individual games. Real human developers have to help that hardware achieve peak performance by writing thousands of lines of custom code.
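To make that gateway idea concrete, here’s a minimal, purely illustrative sketch of the kind of capability check a game like Oblivion performs before enabling a feature such as HDR. The `Driver` class and the feature names are hypothetical stand-ins, not any real driver API — the point is that a game can only ask the driver *whether* a feature is advertised, not whether the driver actually renders it correctly.

```python
# Hypothetical sketch: a game choosing a lighting path based on what the
# graphics driver claims to support. The Driver class and feature names
# are illustrative, not a real graphics API.

class Driver:
    """Stand-in for a driver's capability report (e.g. capability bits)."""
    def __init__(self, supported):
        self.supported = set(supported)

    def has(self, feature):
        return feature in self.supported

def choose_lighting(driver):
    # Prefer HDR, fall back to bloom, then basic lighting -- the same
    # ladder Oblivion exposes in its video options.
    for mode in ("hdr", "bloom", "basic"):
        if driver.has(mode):
            return mode
    return "basic"

# A driver that advertises HDR but renders it incorrectly (as the UHD 630
# appears to) still gets the HDR path: capability bits only say a feature
# exists, not that the driver implements it correctly.
flawed_driver = Driver({"hdr", "bloom", "basic"})
print(choose_lighting(flawed_driver))  # prints "hdr"
```

That gap — between what a driver advertises and what it actually renders — is exactly where per-game driver fixes come in, and it’s the gap Intel apparently hasn’t closed here.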
Nvidia and AMD have excelled at this for years, and each likes to tout the efficiency of its drivers over the other’s. When you buy one of their cards, it’s just as much an investment in their future bespoke driver software as it is their hardware platform.
I’d strongly encourage you to stick with them if you plan on doing any sort of gaming, because Intel just doesn’t have what it takes in this department.
Intel’s graphics drivers aren’t up to the task of handling a thirteen-year-old game, and that’s kind of appalling. Popping over to the 1060, my battery life estimates were slashed in half, but suddenly I had the right visuals again. Cranked up to maximum, the Nvidia GPU effortlessly rendered the game just as the highest-end hardware would have in 2006, with plenty of power to spare. Even the colors looked better.
Here’s a shot on the UHD 630:
And here’s roughly the same spot on my 1060, with the same in-game settings:
Note the dramatic differences in shading, color, and lighting across the walls and floors. And the total lack of shading on the character in the opposite cell on the Intel chip.
Basic rendering features are just being ignored on Intel’s hardware. In a thirteen-year-old game. I know this isn’t the most exhaustive comparison in the world, but the whole game suffers from a flat, slightly washed-out look with zero shadows of any kind. After about 30 minutes on the Intel chip I was like no thank you.
“Alex, aren’t you making a mountain out of a molehill? So what if some of the shadows and shading weren’t working? It still worked!”
That’s the kind of attitude that’s always meant Intel gets crushed in the graphics space. And they’ve shown no signs of righting their course.
There’s no excuse for Intel’s handling of this except to say that it shows you that they don’t really care about graphics. They claim that they do all the time. They boast in online ads that their processing power on integrated GPUs has passed 1 teraflop, and they’re getting ready to integrate that into their newest mobile chips. A paltry sum compared even with today’s base PS4 and Xbox One consoles, but I suppose it’s a nice offering for the power efficiency they provide.
More surprisingly, Intel still claims that they’re going to release a dedicated full video card sometime in 2020. But seeing their performance struggles in this ancient game on their midrange solution, and their issues in general over the last decade-plus of integrated graphics hardware…I’m having a hard time mustering any enthusiasm.
Without drivers that are up to snuff, no amount of processing power will make a difference.
If you’re looking for an integrated graphics solution for your next desktop or laptop, again, I cannot recommend enough that you run away from Intel and go with one of AMD’s Ryzen APUs instead. You’ll get direct access to AMD’s driver suite, and you’ll be able to play older games flawlessly and newer games better than you’d expect. Even Nvidia’s cheapest mobile GPUs will provide a better gaming experience than Intel’s chips alone. Their MX150 is a prominent player in the budget space, and that’s just as much down to extensive software support as it is the hardware.
In the old days it was easy to cast integrated graphics aside as garbage and not even think about them, but I thought that perception had changed a year or two ago. It turns out I was wrong, at least on the Intel side. Intel’s chips might have solid performance in certain video rendering tasks and even some modern video games, but that just means there’s even less of an excuse for older titles not to work correctly.