GeForce FX
The GeForce FX is a series of graphics cards in the GeForce line from the manufacturer NVIDIA. The fastest model, the GeForce FX 5950 Ultra, appears comparable to competitor ATI Technologies' Radeon 9800 XT.
It features DDR, DDR-II or GDDR-3 memory, a 130 nanometer fabrication process, and a wide range of additional features, including what were at the time among the most capable vertex shader and pixel shader engines available. It is fully DirectX 9 compliant. Limited samples of the GeForce FX were sent to the press on November 18, 2002; mass production was expected to begin in December 2002, with cards finally reaching the market by February 2003.
The GeForce FX also included an improved VPE (Video Processing Engine), first deployed in the GeForce4 MX. The main improvement was per-pixel video deinterlacing, a feature first offered in ATI's Radeon but which saw little use until the maturation of Microsoft's DirectX-VA and VMR APIs.
Beaten to market by ATI's Radeon 9700, NVIDIA planned for the GeForce FX 5800 to regain the high-end 3D-performance crown. These plans did not come to fruition. The initial version of the GeForce FX (the 5800) was so large that it occupied two slots, carried a massive heat sink and cooling fan arrangement that produced a great deal of noise, and did not rate well in competitive performance testing. After a late and low-key introduction, NVIDIA withdrew it from the market pending design revisions.
The second iteration of the GeForce FX range, however, did much to redress the balance. NVIDIA introduced the mid-range 5600 and low-end 5200 models, which were priced far more sensibly and required only a single-slot cooling system.
With the launch of the GeForce FX 5900, NVIDIA fixed many of the problems of the 5800. The 5900 performed better and had a quieter cooling system. However, the card still required two slots.
NVIDIA later attacked ATI's mid-range card, the Radeon 9600, with the GeForce FX 5700 and 5900XT. The 5700 was a new card based on the NV35 architecture using DDR II memory. High prices of DDR II memory kept the FX 5700 expensive, leading NVIDIA to introduce the FX 5900XT. The 5900XT is identical to the 5900 but runs slower, using slower memory and less elaborate power circuitry.
Although NVIDIA had ceded market leadership to ATI in the first half of 2003, it regained much of the support it lost with the FX 5800.
There are also "GeForce PCX" versions of the 5200, 5700 and 5950. These are much the same as the "normal" versions, but they have a bridge linking their native AGP bus to a PCI Express x16 link. They also do not require external power supplies, as the PCI Express bus supplies the required power.
Near the end of its life, the GeForce FX series became notorious for poor performance with DirectX 9 vertex and pixel shaders, leading to significant defeats in newer games. An infamous early release of Half-Life 2 benchmarks showed the GeForce FX 5900 Ultra performing about as fast as a Radeon 9600, a card that generally cost around a third as much. When the game was released, Valve forced DirectX 8 shaders on GeForce FX hardware.
In sales terms, the GeForce FX series was about as successful as its immediate predecessor (though not as successful as the GeForce 2). Among hardware enthusiasts, however, the series was seen as a great disappointment.
GeForce FX Models
Name | Codename | PCI ID | Memory Bus | Notes |
GeForceFX 5200 | NV34 | 0322 (non-Ultra) / 0321 (Ultra) | 64 or 128 bit | Replacement for the GeForce4 MX family. Slower than the GeForce4 Ti 4200 in everything except DirectX 9 operations. Based on the GeForceFX 5600, with a 128-bit memory bus by default. The Quadro FX 500 is based on the GeForceFX 5200. Lacked IntelliSample technology. |
GeForceFX 5600 | NV31 | 0311 (Ultra) / 0312 (non-Ultra) | 64 or 128 bit | Midrange chip, still slower than its predecessor, the GeForce4 Ti 4200, in some operations. No Quadro equivalent. Initial 5600 Ultras were clocked at 325 MHz for core and memory, but later revisions increased the speed to 400 MHz. |
GeForceFX 5800 | NV30 | 0300 (engineering samples) / 0301 (Ultra) / 0302 (non-Ultra) | 128 bit (DDR II) | Replacement for the GeForce4 Ti 4800. Production was troubled by the migration to TSMC's 130 nanometer process. Produced a lot of heat while running; the cooler was nicknamed the 'Dustbuster', 'Vacuum Cleaner', or 'Hoover' by some sites, and NVIDIA later released a video mocking it. These issues caused it to be quickly replaced by the GeForceFX 5900. Its Quadro sibling, the Quadro FX 2000, was somewhat more successful. |
GeForceFX 5900 | NV35 | 0330 (Ultra) / 0331 (non-Ultra) | 256 bit | Replaced the hardwired DirectX 7 T&L units and DirectX 8 integer units with DirectX 9 floating-point units. Introduced a new feature called 'UltraShadow' and upgraded to the CineFX 2.0 specification. Removed the noisy cooler, but still occupied the PCI slot adjacent to the card by default. Quadro equivalent is the Quadro FX 3000. |
GeForceFX 5950 | NV38 | 0333 | 256 bit | Essentially a speed bumped GeForceFX 5900. Several antialiasing and shader unit tweaks in hardware. Several users have been able to 'soft-mod' their GeForce FX 5900 to a 5950. |
GeForceFX 5700 | NV36 | 0341 | 128 bit (DDR II/GDDR-3) | Essentially a modified NV35 chip designed to replace the GeForce FX 5600. It had more success against ATI than its parent, the GeForceFX 5950, beating the Radeon 9600 in many tests. Quadro equivalent is the Quadro FX 1100. Later models were equipped with GDDR-3, clocked higher than the DDR-II modules previously used. |
GeForceFX 5900XT / 5900SE | NV35 | 0332 | 256 bit | A down-clocked GeForce FX 5900 (slower, but identical in functionality), priced just above the GeForceFX 5700 Ultra (around $180-$220 USD). Typically uses slower memory and a single-slot cooling solution. Note: can be soft-modded for roughly 20% extra performance, matching the 5900 series without any extra investment or cooling. |
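The PCI IDs in the table above are what driver and diagnostic software use to tell the models apart. As a minimal sketch, the lookup can be written as a simple dictionary keyed on the device ID; the NVIDIA PCI vendor ID 0x10DE is standard, but since some device IDs are reported inconsistently across sources, only the unambiguous entries from the table are included here.

```python
# Identify a GeForce FX model from a PCI vendor/device ID pair.
# Device IDs are taken from the table above; 0x10DE is NVIDIA's PCI vendor ID.
NVIDIA_VENDOR_ID = 0x10DE

# Subset of unambiguous device IDs from the table.
GEFORCE_FX_IDS = {
    0x0321: "GeForce FX 5200 Ultra",
    0x0300: "GeForce FX 5800 (engineering sample)",
    0x0301: "GeForce FX 5800 Ultra",
    0x0302: "GeForce FX 5800",
    0x0330: "GeForce FX 5900 Ultra",
    0x0333: "GeForce FX 5950",
    0x0341: "GeForce FX 5700",
}

def identify(vendor_id: int, device_id: int) -> str:
    """Return a human-readable name for a PCI vendor/device ID pair."""
    if vendor_id != NVIDIA_VENDOR_ID:
        return "not an NVIDIA device"
    return GEFORCE_FX_IDS.get(device_id, "unknown NVIDIA device")

print(identify(0x10DE, 0x0341))  # GeForce FX 5700
```

On Linux, the same pair can be read from `/sys/bus/pci/devices/*/vendor` and `.../device` and fed into `identify` to spot one of these cards in a running system.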
External links
- NVIDIA: Cinematic Computing for Every User (http://www.nvidia.com/page/fx_desktop.html)