Computers are a wonderful bit of technology that can change our lives in all manner of ways, but let’s be honest, they’re not the simplest thing to wrap your head around when you’re just getting started. A combination of acronyms, numbers and other PC jargon makes things a little difficult to understand, but that’s where we at Tech Advisor come in: we’ll help you understand the core components of a PC, one at a time.
Right here we outline what a GPU is, what it’s used for and some of the burning questions that people have about GPUs. For more on other components, take a look at what is a CPU?, what is RAM? and our CPU vs GPU explainer.
What does GPU stand for?
GPU stands for Graphics Processing Unit, the component tasked with powering the graphics on a PC. Component manufacturer Nvidia coined the term “GPU” in the late 1990s, and nowadays it’s odd to refer to the GPU by its full name instead of the acronym – it is a lot quicker to say and easier to type, after all.
Integrated vs discrete GPU
Most modern CPUs come with a built-in, otherwise known as integrated, GPU. Traditionally, integrated GPUs are used to display Windows on a connected screen: they’re more than enough to play videos, look at images and browse complex websites, but they’re not quite up to the task of powering more graphically demanding software like games and video editors.
That’s not quite the case with more modern integrated graphics though, with Intel’s UHD integrated GPUs offering decent graphics performance without the need for a big, bulky graphics card, helping keep performance up in even lightweight laptops. The integrated GPU in Intel’s 11th-gen processors is capable of running Rocket League smoothly at medium quality – something that wouldn’t have been possible with an integrated GPU only a few years ago.
When it comes to more advanced graphical tasks, you’ll need a discrete GPU, usually found in high-end desktop PCs. That’s where the two main players, AMD and Nvidia, come into play.
Dedicated GPUs from the likes of AMD and Nvidia are completely separate from the CPU and connect directly to the motherboard via a PCIe slot. They sport their own dedicated memory (VRAM) reserved exclusively for graphical operations, allowing them to handle complex graphical applications with ease, and if you still need more power, some high-end PCs will let you run two GPUs side-by-side. That’s arguably overkill for a standard gaming PC, but ideal for enterprise workloads.
Dedicated GPUs should be your go-to if you’re doing anything graphically demanding like rendering 3D CAD models, editing 360-degree video or playing the latest AAA games at ultra quality. If you’re happy playing the occasional game and want to focus more on productivity, a modern integrated GPU should do the job.
What’s ray tracing technology?
Ray tracing technology, found on recent Nvidia RTX 30 Series and AMD RX 6000 Series GPUs, is a dedicated rendering technique that realistically traces the path of light, simulating light sources in a true-to-life fashion to create environments and objects that look more authentic. It also simulates the way light interacts with the virtual objects it touches and how that, in turn, affects the rest of the scene. It’s all very impressive stuff.
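To make the core idea a little more concrete, here’s a toy, CPU-bound sketch of the technique: it fires one ray per “pixel” from a camera, tests it against a single sphere, and shades any hit by how directly the surface faces a light source. The scene, the ASCII output and every name here are our own illustration – real GPUs do this in dedicated hardware, across millions of rays and bounces in parallel.

```python
import math

# Toy ray tracer: one ray per pixel, one sphere, one light.
# Brightness at a hit point comes from the angle between the surface
# normal and the light direction (simple Lambertian shading).

def ray_sphere(origin, direction, center, radius):
    """Distance along a unit-length ray to the nearest sphere hit, or None."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c  # direction is unit length, so a == 1
    if disc < 0:
        return None       # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def render(width=24, height=12):
    """Trace one ray per character cell and return an ASCII image."""
    center, radius = (0.0, 0.0, 3.0), 1.0
    light = (0.577, 0.577, -0.577)  # unit vector pointing towards the light
    rows = []
    for j in range(height):
        row = ""
        for i in range(width):
            # Map the pixel to a ray direction through a virtual screen
            x = (i / (width - 1)) * 2 - 1
            y = 1 - (j / (height - 1)) * 2
            norm = math.sqrt(x * x + y * y + 1)
            d = (x / norm, y / norm, 1 / norm)
            t = ray_sphere((0, 0, 0), d, center, radius)
            if t is None:
                row += " "  # background: nothing for the ray to hit
            else:
                hit = tuple(t * di for di in d)
                n = tuple((h - c) / radius for h, c in zip(hit, center))
                brightness = max(0.0, sum(ni * li for ni, li in zip(n, light)))
                row += " .:-=+*#"[int(brightness * 7)]
        rows.append(row)
    return "\n".join(rows)

print(render())
```

Even this stripped-down version shows why the workload suits a GPU: every pixel’s ray is computed independently of the others, so they can all be traced at the same time.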
The technology can vary depending on the application, but the most popular use for ray tracing right now is gaming.
Ray-traced games look more realistic and provide more immersive environments for gamers to explore, whether it be a single beam of light shining through the boarded-up windows of an abandoned home you’re exploring or neon lighting reflecting perfectly off the shiny metallic walls of a futuristic city you’re blasting your way through. It’s a transformative experience, and one that makes more of a difference than simply boosting textures or resolution.
Minecraft RTX is a perfect example of how much of a difference ray tracing technology can make to a game. To find out more, take a look at our Minecraft RTX hands-on.
Are GPUs only needed to play games?
While discrete GPUs are mainly used to power graphically intense programs like games, photo and video editing software and the like, the use of GPUs is ever-expanding.
Thanks to a GPU’s ability to perform the same operation on many pieces of data in parallel, scientists and engineers are increasingly turning to GPUs for scientific and AI-powered applications. That’s partly down to GPUs being more efficient than high-end CPUs when it comes to machine learning, able to churn through far more data in a given period, and the benefits extend to powering neural networks too.
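That “same operation on many pieces of data” pattern is easy to see in code. The sketch below (our own illustration, run here on the CPU with NumPy) applies ReLU – a common neural-network activation function – first one value at a time in a loop, then as a single vectorised expression over the whole array. The vectorised form is exactly the shape of work that machine-learning frameworks hand off to a GPU’s thousands of parallel cores.

```python
import numpy as np

def relu_loop(values):
    """Per-element loop: the CPU-style, one-value-at-a-time approach."""
    return [max(0.0, v) for v in values]

def relu_vectorised(values):
    """One operation applied across the whole array at once –
    the data-parallel pattern GPUs are built to accelerate."""
    return np.maximum(values, 0.0)

data = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu_loop(data.tolist()))   # [0.0, 0.0, 0.0, 1.5, 3.0]
print(relu_vectorised(data))      # negatives clipped to 0, positives kept
```

Both produce the same result; the difference is that the second expresses the work as one bulk operation, which is what lets a GPU apply it to millions of values simultaneously.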
There’s also the use of GPUs to mine cryptocurrency, something that caused a genuine shortage of stock in the GPU market at the mining boom’s peak back in 2017 and 2018.