To start things off with a bit of a thump, we're going to borrow a little from the film Trainspotting and use some inspiration from Pete's UT review…
Choose no life, choose a shite 3D card, choose wondering why your T&L doesn't work on a cold Sunday morning, choose a three-piece factory leisure suit, choose a set of tacky dining room chairs, no health insurance and choose a Savage 2000+ with T&L on the box, or choose life. Choose the fastest 3D card. Choose 1.6 GigaTexels/sec. Choose four pixels per clock. Choose per-pixel shading, choose all that bloody fruit in games and dot product 3 bump mapping. Choose playing Quake III: Arena with a BFG and texture compression and geometry on full. Choose hardware motion compensation, hardware DVD/HDTV playback and a bloody f***ing big television. Choose a new GeForce2 GTS. But why on Earth would you want to do a thing like that?
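That headline 1.6 GigaTexels/sec figure isn't marketing magic; it falls straight out of the chip's layout: four pixel pipelines, each applying two textures per clock. The 200 MHz core clock used below is NVIDIA's published GTS spec, not a number taken from the text above, so treat this as a back-of-the-envelope sketch:

```python
# Back-of-the-envelope fill-rate arithmetic for the GeForce2 GTS.
# Assumption: 200 MHz core clock (NVIDIA's published spec for the GTS).
pipelines = 4            # pixels rendered per clock
texels_per_pipeline = 2  # dual texture units on each pipeline
core_clock_hz = 200e6    # 200 MHz

pixel_fill = pipelines * core_clock_hz         # pixels per second
texel_fill = pixel_fill * texels_per_pipeline  # texels per second

print(f"{pixel_fill / 1e6:.0f} MPixels/s, {texel_fill / 1e9:.1f} GTexels/s")
# → 800 MPixels/s, 1.6 GTexels/s
```

With both texture units engaged (as in multitextured games like Quake III: Arena), the card's effective texel throughput doubles its raw pixel rate, which is where the "GigaTexel Shader" branding comes from.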
With the release of the original TNT, NVIDIA went from being an also-ran 3D company with one near-death experience under their belt to being a strong player in the consumer real-time 3D graphics chip biz. That was September 1998, barely nineteen months ago.
Six months later, in March of 1999, NVIDIA shipped their TNT2, with which they took the 3D speed crown from 3dfx and made 32-bit 3D home gaming a reality. And as if that weren't enough, after another seven months, they released the GeForce 256. With a high fill rate eventually fed by DDR memory, quality 32-bit 3D color and a hardware Transformation and Lighting engine, the GeForce 256 not only bested the competition, it has dominated the 3D OEM market until this very day.
Another six months have passed since the release of the GeForce 256. With OEMs releasing their Spring product lineup, it's time for something new. Therefore, NVIDIA Corporation has announced the GeForce2 GTS, the successor to the GeForce 256.
Today we put one through a comprehensive set of benchmarks to show you how well it performs, and we also take an in-depth look at the technology behind the GeForce2 GTS. NVIDIA claims a doubling of real-world performance over the GeForce 256 SDR board. Does the card live up to that claim?