We all know that a CPU (Central Processing Unit) is the brain of every PC and laptop. In addition to the CPU, PCs now come with something called a GPU (Graphics Processing Unit). This special processor takes over from the CPU the task of rendering (drawing) all the complicated graphics onscreen. Everything from game consoles to workstations to PCs comes equipped with a GPU. The terms GPU, video card, and graphics card are often used interchangeably; we'll stick to GPU.

Now, why do we need a GPU in the first place? As we can all witness, onscreen graphics effects, life-like motion and eye candy are getting more intense, and therefore more complicated, with each passing year. We need a GPU to process these graphics and to handle tasks like 3D gaming. A GPU provides the horsepower required to tackle the complex algorithms behind today's intensely photorealistic onscreen imagery and fluid movement, without choppiness, lag or drag. And it's not just about fun and games. Contemporary GPUs are regarded as high-performance multi-core processors capable of speeding up a vast range of science and engineering applications as well. In fact, some experts believe that within a few years, the real processing power in PCs will come from the GPU, not the CPU.

Most PCs have integrated GPUs, while higher-end ones have dedicated (also known as discrete) GPUs. GPUs are memory-intensive creatures: dedicated graphics cards have their own memory, while integrated graphics must share the system RAM. Dedicated GPUs are an expensive proposition and not something everyone needs. Typically, applications that involve parallel processing (3D gaming, science and engineering workloads, and so on) benefit the most from GPUs.

Three players dominate the global PC graphics market right now: Nvidia, ATI and Intel. Nvidia and ATI produce dedicated cards, while Intel's graphics products are integrated into its motherboard chipsets.
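To get a feel for the kind of parallel processing mentioned above, here is a minimal sketch in Python (not real GPU code): it applies the same independent operation to every pixel of a tiny mock frame, the data-parallel pattern that a GPU would spread across thousands of cores. The `shade` function and the brightness-doubling operation are illustrative inventions, with CPU worker processes standing in for GPU cores.

```python
from multiprocessing import Pool

def shade(pixel):
    # A toy "pixel shader": doubles each color channel, clamped to 255.
    # Each pixel is processed independently of every other pixel --
    # exactly the pattern GPUs are built to run massively in parallel.
    r, g, b = pixel
    return (min(255, r * 2), min(255, g * 2), min(255, b * 2))

if __name__ == "__main__":
    frame = [(10, 20, 30)] * 8          # a tiny mock frame buffer
    with Pool(4) as pool:               # CPU processes stand in for GPU cores
        shaded = pool.map(shade, frame) # same operation on every element
    print(shaded[0])                    # (20, 40, 60)
```

Because no pixel depends on any other, the work splits cleanly across however many cores are available; this independence is what lets GPUs scale the same computation to millions of pixels per frame.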