NVIDIA
NVIDIA Corporation (Nasdaq: NVDA) is a major supplier of graphics processors (graphics processing units, GPUs), graphics cards, and media and communications devices for PCs and game consoles such as the Xbox. It is headquartered in Santa Clara, California. In 2001 it had revenues of US$1.37 billion, up from $735.3 million in 2000; net income rose to $177.1 million from $99.9 million.
History
Jen-Hsun Huang, Chris Malachowsky, and Curtis Priem founded the company in January 1993 and incorporated it in California in April 1993 (later re-incorporating in Delaware). The company remained relatively low-key until 1997-98, when it launched its RIVA line of PC graphics processors. It went public on Nasdaq in January 1999 and shipped its 10 millionth graphics processor in May of that year. In 2000 it acquired the intellectual assets of one-time rival 3dfx, one of the biggest graphics companies of the mid-to-late 1990s. NVIDIA established close ties with many OEMs as well as with organizations such as SGI. By February 2002, NVIDIA had shipped 100 million processors.
Today, NVIDIA and ATI Technologies supply the majority of "discrete" graphics chips found in modern mainstream PCs. NVIDIA's GeForce line of graphics processors, first launched in 1999, is roughly comparable to ATI's Radeon line.
As a "fabless" high-tech company, NVIDIA conducts research & development of chips in-house, but subcontracts the actual (silicon) manufacturing to third-parties. In the past, NVIDIA has sourced silicon production capacity from STMicroelectronics, TSMC, and IBM. The production-chain of a chip involves multiple third-parties: the foundry makes processes wafers, the test-house tests the dies for defects and sorts them based on performance-characterization, and the packager seals individual dies in a hardened case. In terms of inventory management, NVIDIA must place foundry orders months in advance of their planned sale, then hold the produced chips in a warehouse until final delivery. This leads to occasional supply/demand imbalances.
Products
NVIDIA's product portfolio includes graphics processors, wireless communications processors, PC platform (motherboard core-logic) chipsets, and digital media player software. Within the Mac/PC user community, NVIDIA is best known for its GeForce product line, which is not only a complete line of "discrete" graphics chips found in AIB (add-in-board) video cards, but also a core technology in both the Microsoft Xbox game console and nForce motherboards.
In many respects NVIDIA is similar to its arch-rival ATI: both companies began with a focus on the PC market and later expanded into chips for non-PC applications. In the PC graphics market, however, the two differ in one key respect: unlike ATI, NVIDIA does not sell graphics boards into the retail market, focusing instead on developing and manufacturing GPU chips. NVIDIA therefore has no direct presence in the consumer retail market, whereas ATI sells graphics boards under its own "Built by ATI" brand. As part of their operations, both ATI and NVIDIA create "reference designs" (board schematics) and provide manufacturing samples to their board partners.
In December 2004, it was announced that NVIDIA would assist Sony with the design of the graphics processor in the upcoming PlayStation 3 game console. Although exact details of the collaboration are not publicly known, Sony has confirmed that NVIDIA will be manufacturing the PS3's GPU. This is a departure from NVIDIA's arrangement with Microsoft, in which NVIDIA alone manages procurement and delivery of the Xbox GPU through its usual third-party foundry contracts. (Meanwhile, Microsoft has chosen ATI for the Xbox 360's graphics hardware, as has Nintendo for the console that will succeed the ATI-based GameCube.)
- "Discrete" refers to the graphic chip's boundary/proximity to other PC hardware. In the context of PC graphics, the term "discrete" is used to draw a distinction between "integrated graphics", where graphics-functionality is wholly contained in the motherboard "core-logic" chipset, and "discrete" graphics, where the functionality is located in separate chip(s) outside of the core-logic chipset.
Graphics chipsets
- NV1 - NVIDIA's first product based upon quadratic surfaces
- Riva 128, Riva 128ZX - NVIDIA's first DirectX compliant hardware
- Riva TNT, Riva TNT2 - The series that made NVIDIA a market leader
- GeForce
- GeForce 256 - Hardware transform and lighting
- GeForce 2 - DirectX 7 support
- GeForce 3 - DirectX 8 shaders
- GeForce 4 - High end parts and a new budget core
- GeForce FX series - DirectX 9 support, claimed to offer 'cinematic effects'
- GeForce 6 series - DirectX 9C support, features improved shaders and reduced power consumption
- GeForce 7 series - Improved shading performance, Transparency Supersampling (TSAA) and Transparency Multisampling (TMAA) anti-aliasing
- Quadro (GeForce based professional chipsets)
Personal computer platforms / chipsets
- nForce
- nForce IGP (AMD Athlon/Duron K7 line, "IGP" for integrated graphics)
- nForce 2 (AMD Athlon/Duron K7 line, SPP (system platform processor) or IGP, also features SoundStorm)
- nForce 3 (AMD Athlon 64/Athlon 64 FX/AMD Opteron, SPP only)
- nForce 3 Mobile (mobile AMD Athlon 64/Transmeta Crusoe)
- nForce 4 (PCI Express support for AMD64 processors and SLI technology)
- Xbox (Intel Pentium III/Celeron)
Market trends
NVIDIA's first graphics card, the NV1, was released in 1995. It was based upon quadratic surfaces and included an integrated, playback-only sound card. Because the Sega Saturn was also based upon forward-rendered quads, several Saturn games, such as Panzer Dragoon and Virtua Fighter Remix, were converted to the NV1 on the PC. However, the NV1 struggled in a marketplace full of competing proprietary standards.
Market interest in the product ended when Microsoft announced the DirectX specifications, which were based upon polygons. NV1 development subsequently continued internally as the NV2 project, funded by several million dollars of investment from Sega, which hoped an integrated sound and graphics chip would cut the manufacturing cost of its next console. However, Sega eventually realised that quadratic surfaces were a flawed approach, and there is no evidence the chip was ever properly debugged. The NV2 episode remains something of a dark corporate secret for NVIDIA.
NVIDIA's CEO Jen-Hsun Huang realised at this point that, after two failed products, something had to change if the company was to survive. He hired David Kirk as Chief Scientist from software developer Crystal Dynamics, a company renowned for the visual quality of its titles. Kirk turned NVIDIA around by combining the company's 3D hardware experience with an intimate understanding of practical rendering implementations.
As part of the corporate transformation, NVIDIA abandoned proprietary interfaces, sought to fully support DirectX, and dropped multimedia functionality in order to reduce manufacturing costs. It also adopted an internal six-month product-cycle goal, so that the failure of any one product would not threaten the company's survival: a next-generation replacement part would always be available.
However, because the Sega NV2 contract was secret and the company had laid off employees, to many industry observers at the time NVIDIA looked dead in the water. So when the RIVA 128 was first announced in 1997, its specifications were hard to believe: performance superior to the market-leading 3dfx Voodoo Graphics, plus a full hardware triangle setup engine. The RIVA 128 did ship in volume, and the combination of low cost and high-performance 2D/3D acceleration made it a popular choice for OEMs.
Having finally developed and shipped a market-leading integrated graphics chipset in volume, NVIDIA set the internal goal of doubling the number of pixel pipelines in its next chip in order to realise a substantial performance gain. The TwiN Texel (RIVA TNT) engine it subsequently developed allowed either two textures to be applied to a single pixel or two pixels to be processed per clock cycle; the former improved visual quality, while the latter doubled the maximum fill rate.
New features included a 24-bit Z-buffer with 8-bit stencil support, anisotropic filtering, and per-pixel MIP mapping. In certain respects, such as transistor count, the TNT had begun to rival Intel's Pentium processors for complexity. However, while the TNT offered an astonishing range of quality integrated features, it failed to displace the market leader, the Voodoo 2, because its actual clock speed ended up at only 90 MHz, about 35% lower than expected.
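As a rough illustration of that trade-off, the short Python sketch below works through the arithmetic; the two-pipeline figure is taken from the TwiN Texel description above and the 90 MHz clock from the shipped part, so the numbers are illustrative rather than measured.

```python
# Fill-rate arithmetic for the RIVA TNT's "TwiN Texel" design (a sketch:
# two pipelines per the description above, 90 MHz per the shipped part).

def fill_rates(clock_mhz, pixels_per_clock, textures_per_pixel):
    """Return (pixel fill rate, texel fill rate) in millions per second."""
    pixel_rate = clock_mhz * pixels_per_clock
    texel_rate = pixel_rate * textures_per_pixel
    return pixel_rate, texel_rate

# Mode 1: each pipeline emits one single-textured pixel per clock.
print(fill_rates(90, pixels_per_clock=2, textures_per_pixel=1))  # (180, 180)

# Mode 2: both texture units cooperate on one dual-textured pixel per clock.
print(fill_rates(90, pixels_per_clock=1, textures_per_pixel=2))  # (90, 180)
```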
However, this was only a temporary respite for Voodoo, as NVIDIA's refresh part was a die shrink of the TNT architecture from 350 nm to 250 nm. Stock TNT2s now ran at 125 MHz, Ultras at 150 MHz. The Voodoo 3 was barely any faster and lacked features such as 32-bit color. The RIVA TNT2 marked a major turning point for NVIDIA: the company had finally delivered a product competitive with the fastest on the market, with a superior feature set and strong 2D functionality, all integrated onto a single die with strong yields that ramped to impressive clock speeds.
Not content to sit back, NVIDIA released the GeForce 256 (NV10) in the fall of 1999, most notably bringing on-board transformation and lighting. The GeForce 256 also implemented advanced video acceleration, motion compensation, and hardware sub-picture alpha blending, and had four pixel pipelines. Combined with DDR memory support, NVIDIA's technology was the hands-down performance leader.
Basking in the success of its products, NVIDIA won the contract to develop the graphics hardware for Microsoft's Xbox, which brought a huge $200 million advance. However, the project drew away the time of many of NVIDIA's best engineers. In the short term this was of no importance, and the GeForce 2 GTS shipped in the spring of 2000.
The GTS benefited from the fact that NVIDIA had by this time acquired extensive manufacturing experience with its highly integrated cores and was able to optimise the core for clock speed. The volume of chips NVIDIA was producing also enabled it to bin-split parts, picking out the highest-quality cores for its premium range. As a result, the GTS shipped at 200 MHz. Its pixel fill rate was nearly double that of the GeForce 256, and its texel fill rate nearly quadrupled because multi-texturing was added to each pixel pipeline. New features included S3TC compression, FSAA, and improved MPEG-2 motion compensation.
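To put the doubling and quadrupling in concrete terms, the sketch below runs the same kind of calculation; the 200 MHz clock, four pipelines, and two texture units per pipeline for the GTS come from the text, while the 120 MHz GeForce 256 clock is an assumed figure used purely for illustration.

```python
# Rough theoretical fill-rate comparison: GeForce 256 vs. GeForce 2 GTS.
# GTS figures are from the text; the 120 MHz GeForce 256 clock is assumed.

def rates(clock_mhz, pipelines, texture_units_per_pipeline):
    pixel = clock_mhz * pipelines                 # Mpixels/s
    texel = pixel * texture_units_per_pipeline    # Mtexels/s
    return pixel, texel

gf256 = rates(120, 4, 1)   # (480, 480)
gts = rates(200, 4, 2)     # (800, 1600)

print(gts[0] / gf256[0])   # roughly 1.7x the pixel fill rate
print(gts[1] / gf256[1])   # roughly 3.3x the texel fill rate
```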
More significantly, shortly afterwards NVIDIA launched the GeForce 2 MX, intended for the budget and OEM markets. It had two fewer pixel pipelines and ran at 120 MHz. Offering strong performance at a bargain-basement price, the GeForce 2 MX is probably the most successful graphics chipset of all time. A mobile version, the GeForce 2 Go, also shipped at the end of 2000.
All of this finally proved too much for 3dfx, whose Voodoo 5 had been delayed, and its board of directors started the process of dissolving the company, in what became one of the most spectacular and public bankruptcies in the history of personal computing. NVIDIA purchased 3dfx's assets primarily for the intellectual property, which was in dispute at the time, but also acquired anti-aliasing expertise and about 100 engineers.
At this point NVIDIA's market position looked unassailable, and industry observers began to refer to NVIDIA as the Intel of the graphics industry. However, while the next-generation FX chips were being developed, many of NVIDIA's best engineers were working on the Xbox contract, developing the SoundStorm audio chip and a motherboard solution.
It is also worth noting that Microsoft paid NVIDIA for the chips themselves, under a contract that did not account for manufacturing costs falling as process technology improved. Microsoft eventually realised its mistake, but NVIDIA refused to renegotiate the terms, and the two companies, which had previously worked very closely, fell out. NVIDIA was not consulted when the DirectX 9 specification was drawn up. Apparently as a result, ATI designed the Radeon 9700 to fit the DirectX 9 specification: rendering color support was limited to 24 bits, and shader performance had been emphasised throughout development, since this was to be the main focus of DirectX 9.
In contrast, NVIDIA's FX cards offered 16-bit and 32-bit floating-point shader precision, giving a choice between poor visual quality and slow performance. The 32-bit support also made the chips much more expensive to manufacture, requiring a higher transistor count, and shader performance was often half or less of that provided by ATI. Having made its reputation by providing easy-to-manufacture DirectX-compatible parts, NVIDIA had misjudged Microsoft's next standard and was to pay a heavy price for this error. As more and more games started to rely on DirectX 9 features, the poor shader performance of the GeForce FX series became ever more obvious.
NVIDIA became ever more desperate to hide the shortcomings of the GeForce FX range. A notable 'FX only' demo called Dawn was released, but the wrapper was hacked to run on a Radeon 9700, where it ran faster despite a translation overhead. NVIDIA also began to include 'optimizations' in its drivers to increase performance. While some of these were valid, hardware review sites started to run articles showing how NVIDIA's drivers auto-detected benchmarks and produced artificially inflated scores that did not relate to real-world gaming performance. Oftentimes it was tips from ATI's driver development team that lay behind these articles. As NVIDIA's drivers became ever more full of hacks and 'optimizations', their legendary stability and compatibility also began to suffer.
Furthermore, the GeForce FX series ran hot, drawing as much as double the power of equivalent parts from ATI. The NV30 became notorious for its fan noise and acquired the nickname 'Dustbuster'; although it was withdrawn and replaced with quieter parts, NVIDIA was still forced to ship large and expensive coolers on its FX parts. The 5700 Ultra, a late revision, ended up as the only FX-series part with competitive shader performance. As a result, quite unexpectedly, NVIDIA lost its market leadership position to ATI.
While the GeForce 6 series has addressed the issues that plagued the FX series, namely shader performance and power consumption, recovering the company's once dominant market position has proved difficult. By working closely with developers, most especially through its "The Way It's Meant to Be Played" program, NVIDIA aims never again to make the mistake of producing hardware out of sync with industry expectations.
One bright spot for NVIDIA's GeForce is the Linux and FreeBSD market. NVIDIA offers support by way of binary graphics drivers for X11 and a thin open-source library that interfaces between the Linux or FreeBSD kernel and the proprietary graphics software. Most Linux users feel that NVIDIA's binary drivers allow its hardware to outperform similarly priced ATI hardware, although ATI is working to address the perceived weaknesses of its Linux drivers.
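For readers curious how this works in practice, the fragment below is a minimal sketch of the X server configuration (XF86Config or xorg.conf, depending on the distribution) that selects the proprietary driver once NVIDIA's driver package and matching kernel module are installed; the "Videocard0" identifier is just an illustrative name.

```
Section "Device"
    Identifier  "Videocard0"
    Driver      "nvidia"      # proprietary binary driver (the open-source driver is "nv")
EndSection

Section "Module"
    Load  "glx"               # NVIDIA's GLX module provides hardware OpenGL
EndSection
```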
External links
- NVIDIA's website (http://www.nvidia.com/)
- Tweakguides.com "Nvidia Forceware Tweak Guide" (http://www.tweakguides.com/NVFORCE_1.html)
- Firing Squad: History of NVIDIA (http://www.firingsquad.com/features/nvidiahistory/)
- unofficial nVidia-Linux Installation Wiki (http://www.gmpf.de/index.php/Main_Page)