Tuesday, February 26, 2008

Nvidia – Graphics Unlimited

Welcome to the world of pixels, shaders, graphics processing units (GPUs) and stream processors. We are talking about Nvidia, Forbes's Company of the Year for 2007. If you belong to that category called “gamers” (yes, we are human too!!), then Nvidia would be on your lips 24x7, because it is the holy grail of gaming hardware. But there is more to it than that. Nvidia is a prime example of clear vision and strategy. Not only does the company set the bar, it continues to raise it time after time. Read on.

Origins
Since its inception in 1993, Nvidia has been at the forefront of graphics innovation. Its chief executive, Jen-Hsun Huang, and cofounders Christopher Malachowsky (now Nvidia's vice president of information technology) and Curtis Priem (retired in 2003) noted the low-texture graphics that gave the games of the time an unsatisfying, cartoonish feel. It was then that Nvidia saw a glimpse of the future: it was all about improving the visual and gaming experience.

The GPU – Why is it important?
In simple terms, a GPU offloads graphics processing from the CPU entirely, leaving the processor free for general computation. The high graphics requirements of Windows Vista are a testament to this division of labour. A GPU's primary function is processing large amounts of raw graphical data in parallel. Currently, Nvidia addresses different segments through different ranges of cards: the GeForce series for consumer PCs, the Quadro range for animation and workstations, the Tesla series for scientific computing, GoForce for mobiles and GeForce M for laptops. To give an idea of today's graphical requirements, consider this: an Intel Core 2 Quad processor has up to four processing cores on a single chip; Nvidia's new GeForce 8800 GTS has an astounding 128 stream processors.
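To make the offloading idea concrete, here is a minimal sketch in CUDA, Nvidia's general-purpose GPU programming toolkit (the example is ours, not the article's; the kernel and names are purely illustrative). Thousands of lightweight GPU threads each process one array element in parallel, while the CPU merely sets up the work:

```cuda
// Minimal CUDA sketch (illustrative): hand a data-parallel job to the GPU.
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each GPU thread scales a single element. On a card like the
// GeForce 8800 GTS, 128 stream processors chew through these
// threads in parallel while the CPU stays free for other work.
__global__ void scale(float *data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main() {
    const int n = 1 << 20;                  // one million floats
    const size_t bytes = n * sizeof(float);

    float *host = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) host[i] = 1.0f;

    // Copy the data to GPU memory, run the kernel, copy it back.
    float *dev;
    cudaMalloc(&dev, bytes);
    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);

    scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);  // 256 threads per block

    cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);
    printf("host[0] = %f\n", host[0]);      // prints 2.000000

    cudaFree(dev);
    free(host);
    return 0;
}
```

The same pattern, scaled up to millions of pixels and vertices per frame, is what lets a GeForce 8800's 128 stream processors outrun four CPU cores on graphics-style workloads.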

The road to glory
A decade ago, games did not require a dedicated GPU because they were not graphically intensive; a processor alone was enough to run them. As games grew more complex, so did their requirements, and the GPU emerged (in layman's terms, the graphics card). Since then, Nvidia and ATI have slugged it out in the GPU industry, which Nvidia has recently dominated with absolute authority. Not only has it enhanced the gamer's experience, it has also allowed game developers to bring innovation into the mainstream. Its GeForce 8 series, especially the mid-range 8800 GTS and the high-end 8800 GTX and 8800 Ultra, has confirmed its dominance in the consumer graphics card segment. These were the only cards in the market that allowed extensive graphical quality in games at high resolutions, something impossible to experience on older cards.

Its recent acquisition of Ageia Technologies confirms that it aims to strengthen its already formidable presence in the gaming segment. As mentioned before, Nvidia has also proliferated into other segments: laptops, workstations, high-performance computing and mobiles. It was the first company to release a workstation GPU (the Quadro, in 1999) and the first to release a mobile GPU (the GeForce2 Go). Movies like Spider-Man 3 and Open Season used Nvidia's Quadro range to render graphics, while the University of Illinois and Systems Integration have used its Tesla range of cards. Intel still leads the overall graphics chipset market because of its integrated chipsets in motherboards and mobile devices, spaces it continues to dominate. However, Nvidia has been catching up: its market share increased from 19% in Q2 2006 to 34% in Q3 2007, while Intel's slipped from 40% to 38% over the same period.

The X-factor – Innovative technologies
The list could stretch on for ever, but let us restrict ourselves to a few, just to get a taste of what Nvidia is capable of. Its Scalable Link Interface (SLI), a technology that allows multiple GPUs (through two graphics cards) to work together, was the first of its kind in the world. In its present form, it continues to push and challenge its own benchmarks in gaming and high-end computing (beware the new triple-SLI feature, which allows three cards to run simultaneously!). Hybrid SLI is a power-saving feature that gives the option of pairing two different types of chipsets (e.g. a low-power 8500 GT with a power-hungry 8800 GTX) under Windows Vista. Its PureVideo feature uses advanced techniques found only on very high-end consumer players and TVs to make Blu-ray, HD DVD and standard-definition DVD movies, as well as PC and mobile device content, look crisp, clear, smooth and vibrant.

The road ahead – Smooth sailing?
As technologies improve, GPUs will no doubt continue to become more powerful. However, there is a question to ask: does innovation come at a price, and what are its limitations? The current range of GPUs, especially the high-end GeForce 8800 GTX and 8800 Ultra, consumes huge amounts of power. A single card pushes whole-system consumption to 360 watts when fully utilized, and if one throws SLI and triple SLI into the mix, peak draws of 500 and 750 watts are just a stone's throw away. Hence the first challenge is minimizing energy usage. To some extent this problem is being addressed by semiconductor companies like Intel and AMD, which keep shrinking the transistors on their chips. Intel's Yorkfield and Wolfdale use a 45nm process, which has shown reduced power usage; Nvidia currently uses a 65nm chip, codenamed G92, for its new GeForce 8800 GTS card.

However, in recent years limits to Moore's law, which states that the number of transistors on a chip doubles roughly every two years, have come into view. Moore himself has opined that the semiconductor industry has 10-15 years to go before hitting a roadblock. Then there is the competition from consoles: they offer a gaming experience (albeit at a lower graphical quality than a high-end PC) at a price below that of a single high-end GeForce graphics card. For the last four years, Nvidia has been in a position to charge a premium because of the quality it offers, and this trend is set to continue unless ATI comes up with a monster. As Jen-Hsun Huang well knows, if one can deliver something new and exciting, even if only for 6-8 months, that fickle customer will be mighty pleased.

By Dipankar Mohanty
