SharkyForums.Com - Print: High-End Videocard Shootout = BS!!!

    High-End Videocard Shootout = BS!!!
    By GTaudiophile October 09, 2000, 08:09 AM

    Ok, today children, we're going to take 1 3Dfx card, 1 ATI card, and 6 FREAKING NVIDIA CARDS, and see which three are the best overall. Of course children, since we're using 6 cards based on a GPU from ONE company, the law of averages states that the best three will naturally be from THAT company. So, children, instead of giving the bronze to 3Dfx, the silver to ATI, and the gold to nVidia, we're going to blow sunshine up the ass of nVidia by giving them ALL THREE freaking awards. Oh, and by the way children, our next lesson will be about how to cash checks from nVidia!

    WHAT A BUNCH OF BS, SHARKY!!! Do you actually think we're THAT dumb here??? We are not blind! Just because the GF2-Ultra was delayed and ATI announced a roadmap that will destroy nVidia's crown doesn't mean you all should try pulling the blanket over our eyes! I MEAN COME ON!!! The Radeon 32DDR bests the GF2 cards in one benchmark, stays within 4 FPS in the other benchmarks, has arguably the best 3D image quality, easily has the best DVD playback ability, and COSTS $100 LESS THAN the competition!!! I mean, WTF???

    The ATI Radeon 32DDR is CLEARLY the best videocard, and just because of your damned biased article, I will buy an ATI Radeon INSTEAD of a GF2MX. So shove that!!!

    By BloodRed October 09, 2000, 08:22 AM

    First off, I haven't read the article because I can't get to SharkyExtreme here at work. Damn proxies! Anyway....

    Maybe he felt he should compare ALL the products on the market. It's not Sharky's fault that nVidia has released three cards in the time it's taken their competition to release one. This sounds like another, "Waaaa! My video card didn't win..." rant. I'm sure that you would have praised the review if there had been three ATI cards in it.

    By GTaudiophile October 09, 2000, 08:34 AM

    They are all pretty much the same videocards (powered by the GeForce2 GPU), just made by different brands (Creative, Elsa, Asus, etc). Sharky could have used the best one to compare to the 3Dfx and ATI cards, then given out awards.

    By Humus October 09, 2000, 09:57 AM

    I agree, if they had taken one GTS, one V5 and one Radeon it would be much better. And why is image quality only worth 10 points??? I don't get it ...

    By Wotan81 October 09, 2000, 10:01 AM

    I totally agree with GTaudiophile!!!

    Sharky swims in murky nVidia waters! How much did they pay You?

    By toonzwile October 09, 2000, 05:59 PM

    quote:Originally posted by Wotan81:
    I totally agree with GTaudiophile!!!

    Sharky swims in murky nVidia waters! How much did they pay You?


    You guys are a bunch of dumb***es... nvidia has the most products out to market right now, and each designer brings new things to their product... so u get different varieties. If ATI didn't have the $$ foundation, I'm sure there would be a bunch of Radeon cards floating around, and same goes for 3dfx. ATI and 3dfx are both larger companies than nvidia, and have chosen to produce their own products without outside manufacturers. Nvidia didn't choose that path. 6 months from now, if ATI holds to their roadmap, there will be 6 new video cards from ATI, hopefully an NV20 product, and -- this is by far the LONG SHOT -- a Rampage product from 3dfx. When Sharky decides to do a review and they include 6 Radeon cards with 1 from Nvidia & 3dfx each, then WTF are u idiots gonna say??!

    * Sorry for the language, Sharky.

    By Doward October 09, 2000, 06:19 PM

    Why no 64 meg Radeon? Isn't its 183/183 clock gonna help?

    And yeah, 6 NVidia cards was kinda dumb. Maybe 6 diff cores would be ok, but not all the same. Come on!

    By bclothie October 09, 2000, 06:37 PM

    Originally posted by toonzwile:

    >nvidia has the most products out to market
    >right now, and each designer brings new
    >things to their product...

    No, NVidia doesn't and these "designers" just clone reference boards. Just looking at the benchmarks of GeForce2 products should tell you that they are pretty much all the same.

    Basically, NVidia has two major products shipping: the GeForce2, and the GeForce2 MX. These graphics cores have been paired with different memory combinations to satisfy different price points. For example, you have the GeForce2 GTS (32MB and 64MB DDR SDRAM), the GeForce MX (32MB SDR, 16MB DDR), etc.

    ATI has a similar line-up. For example, ATI has the 166MHz Radeon core and the 183MHz Radeon core. They are split into three retail categories: Retail 64MB Radeon (183MHz, Rage Theater included), Retail 32MB Radeon (166MHz, no Rage Theater), Retail All-In-Wonder 32MB Radeon (166MHz). And a myriad of OEM products: OEM 64MB Radeon (166MHz, no Rage Theater available yet), etc. These also address different price points. Actually, I consider the All-In-Wonder a distinct product because of its TV Tuner and integrated video capture features.

    If you move down the video segment, NVidia has the GeForce256 and TNT2 while ATI has the Rage 128, Rage 128 Pro, and Rage Fury Maxx. With no disrespect intended to 3Dfx, I will stop my argument here because I have already made my point.

    >so u get different varieties. If ATI didnt
    >have the $$ foundation, im sure there would
    >be a bunch of Radeon cards floating around,
    >and same goes for 3dfx. ATI and 3dfx are
    >both larger companies than nvidia, and have
    >chosen to produce their own products
    >without outside manufacturers.

    Wrong again. ATI is bigger than both NVidia and 3Dfx. But 3Dfx is not larger than NVidia in terms of cash flow or revenue. In terms of employees, 3Dfx is larger than NVidia, but that fact is meaningless. You correctly pointed out that NVidia out-sources its card manufacturing. In those terms, I would argue that if you included NVidia's manufacturing support, NVidia is larger than 3Dfx.

    >Nvidia didnt choose that path. 6 months
    >from now, if ATI holds to their roadmap,
    >there will be 6 new video cards from ATI,
    >hopefully an NV20 product, and -- this is
    >by far the LONG SHOT -- a Rampage product
    >from 3dfx. When Sharky decides to do a
    >review and they include 6 Radeon cards with
    >1 from Nvidia & 3dfx each, then WTF are
    >u idiots gonna say??!

    No, actually your ignorance of both NVidia's and ATI's product line-ups is the point here. The actual number of products will stay roughly the same. Why? Because the price points in the video card market remain the same. Older products will be phased out of the market to make room for new ones.

    For example, when NVidia introduces the GeForce2 Pro and GeForce2 Ultra, the GeForce2 and GeForce MX will be shifted down. The TNT2 will disappear, perhaps preserved as an integrated Northbridge solution but that is it. Watch the GeForce256 vanish entirely.

    The same will happen with ATI's line when the Radeon II is introduced. I do, however, think the Radeon All-In-Wonder is a unique product. NVidia offers nothing like it, nor do its manufacturing partners. This product is unique based on a truly value-added feature set.

    By troop October 09, 2000, 07:36 PM

    Yeah, this is an article with a huge flaw, but using several identical video adapters is not it. That is only redundant. The flaw is contained here:

    "All benchmarks were run with a desktop color depth of 16-bits,"

    and then we see this nonsense:

    "The GeForce2-based cards rule the table here in 16 and 32-bit color"

    Stupid.

    And the importance of this error is shown in the result of the only 32-bit test results displayed:

    "Re-Volt 1600 x 1200 x 32

    ATI's Radeon dominates on both of our performance platforms, scoring 60.2 on the Intel system and 60.1 on the AMD machine"

    The test purports to be a complete analysis yet mostly analyzes 16-bit performance, showing one adapter the clear winner. But at the color depth most power users employ--32-bit--another card dominates.

    So what's the point of this effort?


    By BloodRed October 09, 2000, 09:14 PM

    quote:Originally posted by Doward:
    Why no 64 meg Radeon? Isn't its 183/183 clock gonna help?

    And yeah, 6 NVidia cards was kinda dumb. Maybe 6 diff cores would be ok, but not all the same. Come on!

    It was a review of 32MB cards, that's why the 64MB Radeon isn't in there. Yes, the V5 is a 64MB card, but Sharky specifically says in the review that it's in there because it's not a true 64MB card, as each chip only has 32MB to work with. As for the different cores, you're missing the point. He isn't writing an nVidia vs. 3DFX vs. ATI review, he's writing a review of ALL the major high-end video cards on the market today. Like it or not, Joe Schmoe who goes to a hardware store to buy a video card is probably going to see all those cards lined up side-by-side, and he isn't going to know the difference between one and the other. That's who this article was geared towards, and that's why there are 6 nVidia cards in there and only one V5 and Radeon. You all are reading way too much into this, get over it.

    By Ymaster October 09, 2000, 09:49 PM

    How dare you all for putting this work down! I really do not see how you can even think twice about this being bad... We all asked for a round-up of the best cards out to date, and that is what we got. I felt they held their own on every question you nagged about. Do you have that many cards to bench? No! You must not have read the article in full detail.

    Sharky said the V5 was really a dual 32... I think if Voodoo thinks they are all that and a box of Triscuits, they should have been knocked out of the race and compared to the other 64MB cards. But doing that would make the Voodoo5 look worse off than it already is. So I think he did the card some justice. It is clear the Voodoo5 came out too late and crashed big time. This happens in the hardware field. No matter how good a company is, they can make a few mess-ups.

    So with that aside... I felt this article was long overdue. We needed to see a range of new cards to put in the ring and fight it out. How many times have you asked yourself, "I need a new card and there are so many. What do I get?" Well, this was to help just that problem!

    From this article I stepped back and said, "I was right all along!" The major cards (nvidia) are all benching the same. No matter what they add in or take off the cards, either their own drivers or nvidia's, they all are pretty much even in the speed field. As for the rest of the article: they went into great detail about why one card was better than another. And I give them great credit in the hardware review world. There was no bias! Nvidia does not send them a check in the mail! I'm sorry if you think they ever did. But that's just BS thinking. Show me the check stub and I'll think otherwise. The simple fact that there are more nvidia GPU cards does not bias the article. This is life in today's current world market.

    I don't see 20+ 3dfx cards... or ATI cards... So there are no others to bench in the 32MB card range...

    By Ben October 09, 2000, 11:03 PM

    The review definitely gets an F from me. They could easily have included just two GeForce 2s: one with tons of features, and one with no frills. It would be statistically improbable for a non-nvidia card to win this "shootout." So 3dfx and ATI are at an immediate disadvantage because they are outnumbered 3 to 1. This is a complete failure of the "scientific method" principle we were all taught in high school.
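    The "statistically improbable" point above can be put in rough numbers. As a purely illustrative sketch (nothing from the review itself), assume all eight cards were equally likely to take any podium spot; the chance of an all-nVidia top three is then a simple counting exercise:

```python
from math import comb

# 8 cards in the shootout: 6 nVidia-based, 1 3dfx V5, 1 ATI Radeon.
# If every card had an equal shot at a podium place, the probability
# that all three awards land on nVidia boards is C(6,3) / C(8,3).
p_all_nvidia = comb(6, 3) / comb(8, 3)  # 20 / 56
print(f"P(all-nVidia podium under equal odds) = {p_all_nvidia:.3f}")
```

    Under those toy assumptions an all-nVidia sweep happens only about 36% of the time, so a 3-to-1 field stacks the odds without making the result a foregone conclusion.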

    By bclothie October 10, 2000, 12:49 AM

    First Problem:
    The stupid thing about this review is that 1st, 2nd, and 3rd place awards go to boards all based on the same reference design. Why even bother differentiating between them? You might as well have a single winner.

    The entire article treats the NVidia GeForce2 as a single product during image quality, DVD playback, etc. Then, during benchmarking, all of a sudden there are 6 "different" products? No way. Look at the benchmarks for the GeForce2 cards. They are all < 1% of one another. Same product, different label. There are not more NVidia products, just more manufacturers. The perception of choice is artificial.

    Second Problem:
    I grow weary of seeing video card benchmarks that amount to nothing more than: which video card provides the fastest frame rates in Quake II/Quake III? id Software makes fantastic games, but how many people just want a "Quake accelerator"? Given the amount of Quake benchmarking, the addition of Re-Volt seems almost an afterthought. To rank the cards the way the reviewer did, a broader set of benchmarks should have been used.

    Different people will purchase a video card for different reasons. Questions more valid for helping consumers make choices:

    (1) Which card is best for running OpenGL-based games? Benchmarks provided for 16-bit and 32-bit color. Provide 2 or 3 OpenGL game benchmarks.
    (2) Which card is best for running D3D-based games? Benchmarks provided for 16-bit and 32-bit color. Provide 2 or 3 D3D game benchmarks.
    (3) How is performance dependent on CPU power for each card? Show some results for a Celeron, P3, Athlon.

    Third Problem:
    The reviewer admits: "In order to minimize the sheer volume of benchmarks we are accustomed to presenting, we have 'trimmed the fat', so to speak, in order to deliver the most relevant scores. All benchmarks were run with a desktop color depth of 16-bits, and of course v-sync was disabled." Oh really?

    The Radeon core is slower in 16-bit color than a GeForce2 but faster in 32-bit color. Sharky Extreme staff knows this from their own separate review of the Radeon 32MB DDR. So how can your benchmarks be "relevant" if by elimination you have already removed results which considerably favor one solution? Do you know for a fact that most gamers will prefer 16-bit over 32-bit even if the fps change is not noticeable? I would suspect many gamers would prefer the rich 32-bit texture maps.

    The weights used in the final table are rather useless for drawing the conclusion. The image quality of the Radeon is ranked the same as the GeForce2. Yet, once again, in the separate Radeon 32MB DDR review, the image quality of ATI's offering is touted as much better than the NVidia product. This completely undercuts the credibility of the review. Even the weight of 10 is somewhat subjective. Why not 15 or even 20?

    I question the inclusion of "Ease of Use" in a review that focuses mostly on performance and card features. Seriously, all you do is plug the card into the AGP slot, install drivers, etc. Is one card really all that different from another? Enough to merit a category weighted 10 points? Sorry. Just a mention that everything went smoothly would have been fine.

    Also, the fact that "Price" has something to do with "Bundle" completely escapes the reviewer. Obviously the PowerColor's bundle is of lower quality than the other GeForce2 cards'. That is why the price is rock-bottom. Then we move on to ATI. The Radeon gets 5 points less than the winner ELSA in the "Bundle" category but only 2 points more for "Price". But for $100 less, a consumer could take that savings and purchase games he/she would *really* want. That is a "Bundle". So if we up the Radeon 32MB DDR from 4 points to 9 or 10, guess what happens? The Radeon moves to first place with 87 or 88.

    All in all, a poor review.
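    The scorecard arithmetic being objected to above can be sketched in a few lines. All category scores below are hypothetical stand-ins; only the relationships come from the post (the Radeon's Bundle score of 4, the 5-point Bundle and 2-point Price gaps versus ELSA, and an adjusted total of 87):

```python
# Hypothetical weighted scorecard in the style of the review's final
# table. The individual category numbers are invented; only the
# Bundle/Price gaps and the totals mirror the argument above.
def total(card):
    return sum(card.values())

elsa   = {"performance": 42, "image_quality": 10, "dvd": 8,
          "ease_of_use": 9, "price": 8, "bundle": 9}
radeon = {"performance": 40, "image_quality": 10, "dvd": 10,
          "ease_of_use": 8, "price": 10, "bundle": 4}

# The proposed adjustment: credit the Radeon's $100 savings as real
# bundle value by raising its Bundle score from 4 to 9.
radeon_adjusted = {**radeon, "bundle": 9}

print(total(elsa), total(radeon), total(radeon_adjusted))  # 86 82 87
```

    A single 5-point swing in one subjective category flips the podium order, which is the point: the choice and weighting of categories drives the conclusion more than the benchmark decimals do.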

    By Ymaster October 10, 2000, 01:28 AM

    Well, I think it was good. If you all don't like it then DON'T READ SHARKYEXTREME! Go and read Tom's Hardware or something and leave us to our party.

    By rye222 October 10, 2000, 02:19 AM

    You know what, Ben and GTaudiophile, shut the hell up! And the same goes for everyone else out there who has the balls to come to this site and TRY to put down this article (or any other article for that matter). First of all, if you don't like the review or what SharkyExtreme has to say, then DON'T READ IT! All your cheap comments and ignorant put-downs make the rest of us, who actually appreciate their honest and thorough reviews, look bad. SharkyExtreme doesn't charge anyone to come and read the hard-worked reviews that they post. So really, their main inspiration is the positive and APPRECIATIVE feedback from their readers.

    Also, you say that NVidia puts a check in their pocket for them to say what they say. HELLO. If you actually read the article you would have seen that 97% of their reviews and conclusions are based upon facts and benchmarks that they have clearly posted for everyone to see. Now if you say these benchmarks are fake, then why don't you go buy the cards yourself and post your own results proving them wrong? Instead, you choose to open your big mouth and try to make them look stupid just because your card wasn't picked as the best. Of course, people like you probably think SharkyExtreme puts a check in my pocket to say this, so we'll just let your ignorance further prove my point.

    And finally, in regards to the shootout being weighted toward NVidia cards just because they compared 6 NVidia cards to 1 3DFX and 1 Radeon, let's just stop and think about this logically for one moment. First of all, the title of this review was VIDEO CARD SHOOTOUT, not CHIPSET SHOOTOUT. Therefore, their objective was to post which 3 GRAPHICS CARDS were the best. And if all 3 of them turned out to be NVidia cards, then maybe that should tell you something. Obviously the best 3 cards out right now are all GeForce cards! In fact, the MAIN thing that separates performance between one card and another is its CHIPSET. Therefore, if 3DFX's and ATI's chipsets don't perform as well as the GeForce, and there are numerous GeForce cards on the market, then they obviously cannot be part of the top 3 - it's that simple! It doesn't matter if they had reviewed 50 Radeons, 50 3DFXs and 3 GeForces - the 3 GeForce cards would have STILL been the top 3!

    Therefore, what is the next thing to consider in the search for the best 3 cards out right now? Exactly what they discussed: features, software bundles, ETC. I don't know about you, but I am extremely thankful that they even took the time to compare the different GeForce cards out there right now. Because even though they have the same chipset, each one of them has different features. And when I go to buy a card in the store, I want to know which GeForce has what I am looking for - particularly which manufacturer seems to provide the best drivers, stability, software bundle, etc!

    Anyway, I give SharkyExtreme extreme props for the information they provide. I have always come to their site for the most complete, in-depth, thorough and un-biased reviews on the Internet - and I will argue with anyone who begs to differ. And again, if you don't like it, then GO SOMEWHERE ELSE!! Thanks again Sharky, and please don't feel unappreciated and/or discouraged by ignorant feedback as posted above. Keep up the good work! -Rye

    By Just October 10, 2000, 02:45 AM

    Buying video cards should be a personal decision. Sure, you can get ADVICE from friends and experts such as those who roam this board, but it boils down to personal preference. Numbers may mean more to certain people, whereas image quality could mean the world to someone else.

    I hope those who read Sharky's shootout will take into consideration that benchmarks aren't everything. Sure, he had more GeForce chipsets in there than other boards; he's only giving you guys an opportunity to see what they can do. It's free information, people! Although I don't quite have the budget to buy a high-end card right now, if I did have the money, I would certainly have appreciated what Sharky did.

    No one says you gotta buy nvidia just 'cause they got a better review or whatever. If you like ATI or 3dfx more, then why even look at those "shoot-outs"? Your minds are already made up!

    By Wotan81 October 10, 2000, 02:47 AM

    quote:Originally posted by toonzwile:

    You guys are a bunch of dumb***es... nvidia has the most products out to market right now, and each designer brings new things to their product... so u get different varieties. If ATI didn't have the $$ foundation, I'm sure there would be a bunch of Radeon cards floating around, and same goes for 3dfx. ATI and 3dfx are both larger companies than nvidia, and have chosen to produce their own products without outside manufacturers. Nvidia didn't choose that path. 6 months from now, if ATI holds to their roadmap, there will be 6 new video cards from ATI, hopefully an NV20 product, and -- this is by far the LONG SHOT -- a Rampage product from 3dfx. When Sharky decides to do a review and they include 6 Radeon cards with 1 from Nvidia & 3dfx each, then WTF are u idiots gonna say??!

    * Sorry for the language, Sharky.

    In every other review they say that the visual quality of the Radeon is superior to the GeForce GTS's. Sharky is the ONLY one that has given the same score for visual quality to both. That makes you start wondering. Everyone knows that the GTS is faster and that's OK. But can every other review be wrong about visual quality?

    And btw, who are you calling idiots, you inbred retard Bubba-Billy-Sue!

    By Althor October 10, 2000, 04:13 AM

    quote:Originally posted by bclothie:
    Originally posted by toonzwile:

    >nvidia has the most products out to market
    >right now, and each designer brings new
    >things to their product...

    No, NVidia doesn't and these "designers" just clone reference boards. Just looking at the benchmarks of GeForce2 products should tell you that they are pretty much all the same.

    Basically, NVidia has two major products shipping: the GeForce2, and the GeForce2 MX. These graphics cores have been paired with different memory combinations to satisfy different price points. For example, you have the GeForce2 GTS (32MB and 64MB DDR SDRAM), the GeForce MX (32MB SDR, 16MB DDR), etc.

    ATI has a similar line-up. For example, ATI has the 166MHz Radeon core and the 183MHz Radeon core. They are split into three retail categories: Retail 64MB Radeon (183MHz, Rage Theater included), Retail 32MB Radeon (166MHz, no Rage Theater), Retail All-In-Wonder 32MB Radeon (166MHz). And a myriad of OEM products: OEM 64MB Radeon (166MHz, no Rage Theater available yet), etc. These also address different price points. Actually, I consider the All-In-Wonder a distinct product because of its TV Tuner and integrated video capture features.



    I think I didn't get the point of this stupid discussion.
    Is it Sharky's problem or fault that 3dfx stagnates in developing new hardware, so only one interesting product (the V5) is out to test, while 20 GTS vidcards from different manufacturers are to be checked out to find the best card?
    And as you already mentioned, ATI has announced many versions of their Radeon line of products, but they are not all out to test yet. When they are, Sharky will surely test them. And if 3Dfx ever puts out its tenthousandtimesannouncedanddelayed V5 6000, they will also test it. And in the year 2300 when the "Rampage" comes out they will... you'll guess it. So wtf is your problem, guys? Did you buy the wrong vidcards?


    By alexross October 10, 2000, 04:18 AM

    quote:Originally posted by GTaudiophile:
    > Ok, today children, we're going to take 1 3Dfx card, 1 ATI card, and 6 FREAKING NVIDIA CARDS, and see which three are the best overall.

    Yes, that is correct.

    > Of course children, since we're using 6 cards based on a GPU from ONE company, the law of averages states that the best three will naturally be from THAT company.

    That may well be a fact and help the law of averages a little, but we're not handing out awards for technology. When someone goes to Fry's and looks at the wall of video cards he doesn't see NVIDIA, 3dfx, ATI. He sees Creative, Leadtek, ASUS, Hercules, 3dfx, ATI, etc... That was the point of the article. We've already covered whose technology does what, so this article was much more about products from different companies rather than just the smaller perspective on chips.

    > So, children, instead of giving the bronze to 3Dfx, the silver to ATI, and the gold to nVidia, we're going to blow sunshine up the ass of nVidia by giving them ALL THREE freaking awards.

    I'm not sure that I would see the use in doing it any other way than an article based upon all the best cards we have in our lab that came in over the past couple of months, all of which are on the shelves of Fry's. One of the biggest questions I get is "Sharky, I need help choosing a video card. I walked into storeX and there are so many to choose from". That was the reasoning for the piece.

    > Oh, and by the way children, our next lesson will be about how to cash checks from nVidia!

    I take that as an unnecessary insult, and quite unfair. I've said this before and will say so again: we provide a free service to our readers and/or anyone that wants to find out more info on technology and products. We call it like we see it. That's all there is to it. You don't get to have the reputation we do amongst the hardware industry by being "slanted". There are some things that we do not do as well as I'd like and could do better in many areas, but being "more objective" isn't one of them. That is the first thing any writer at SE has drilled into them from day one. I understand that a high-profile site can get attention of the sort you are accusing us of, but I have to disagree with you.

    > WHAT A BUNCH OF BS, SHARKY!!! Do you actually think we're THAT dumb here??? We are not blind!

    No, not at all. I think this forum contains some of the best and most technical people on the NET. I am very impressed, and the quality of the readership never ceases to amaze me. I have learned a lot, much of it from this very forum.

    > Just because the GF2-Ultra was delayed and ATI announced a roadmap that will destroy nVidia's crown doesn't mean you all should try pulling the blanket over our eyes!

    I think you are missing the point again. This article has nothing to do with the really rather boring (doesn't everyone agree?) war that people like to flame each other over. The article is about product.

    > I MEAN COME ON!!! The Radeon 32DDR bests the GF2 cards in one benchmark, stays within 4 FPS in the other benchmarks, has arguably the best 3D image quality, easily has the best DVD playback ability, and COSTS $100 LESS THAN the competition!!! I mean, WTF???

    That's why I wrote what I wrote about it. It's not as close as you say in the benchmarks, but it's a very fine board. I remember saying "Think of this article like the Olympic final. If you made it here there's no reason to be ashamed". I actually said the "Radeon was the best card not to have won a medal and perhaps should have done. If the price had not been subject to a rebate scheme (which not many people end up using) but an actual real 150 bucks then it would have. It has the best DVD and the performance is a little shy of a GeForce 2". I think you are saying the same thing as I did?

    > The ATI Radeon 32DDR is CLEARLY the best videocard, and just because of your damned biased article, I will buy an ATI Radeon INSTEAD of a GF2MX. So shove that!!!

    Great! Good choice and I hope you enjoy it.

    By alexross October 10, 2000, 04:24 AM

    Hi Troop,

    The Windows desktop might be set to 16-bit 1024x768 in order to run the tests in, but the actual tests were all run in 32-bit, and we are well aware of how to force 32-bit over a 16-bit desktop (we are well trained and did so for every benchmark, either via the control panel and/or the registry). Easy to do, and we did that.

    quote:Originally posted by troop:
    Yeah, this is an article with a huge flaw, but using several identical video adapters is not it. That is only redundant. The flaw is contained here:

    "All benchmarks were run with a desktop color depth of 16-bits,"

    and then we see this nonsense:

    "The GeForce2-based cards rule the table here in 16 and 32-bit color"

    Stupid.

    And the importance of this error is shown in the result of the only 32-bit test results displayed:

    "Re-Volt 1600 x 1200 x 32

    ATI's Radeon dominates on both of our performance platforms, scoring 60.2 on the Intel system and 60.1 on the AMD machine"

    The test purports to be a complete analysis yet mostly analyzes 16-bit performance, showing one adapter the clear winner. But at the color depth most power users employ--32-bit--another card dominates.

    So what's the point of this effort?

    By colonel October 10, 2000, 04:44 AM

    In Australia we have a big car rivalry between Ford (who make the Falcon) and Holden (who make the Commodore, or the Lumina as you Americans know it). Our equivalent of NASCAR is touring cars, and the Falcons and Commodores race each other. The biggest race of the season is the Bathurst 1000, and everyone watches it. The next day the winner gloats and the loser makes up stupid excuses about how it was rigged and how one or the other is still better.
    Thanks, Sharky, for doing exactly what I asked you to do in a previous post. It's just a shame that people can't take it when their card loses. What's sad is that we're talking about $250 video cards here, not $30,000 cars, and I find the car arguments sad.

    By MafiaBoss October 10, 2000, 05:25 AM

    Well, I am wondering why some people are angry and throwing words here and there.
    I will speak as a consumer, not as a fan. Sharky did the right thing. You ask why?
    First, now we know the software bundle that comes with each card.
    Second, we now know the best card for us and our needs.
    And about having six cards using the same GPU, and why they did this?
    It reminds me of something: why do PC Mag and some others test different computer systems with the same CPUs?
    My point is, do not make it difficult to judge; just choose the one you like.
    And that, in my opinion, is what Sharky did: they showed us different cards.
    We cannot go out and buy all the cards and test them and then pick one!
    By the way, now it's easy for anyone trying to buy a new card based on an Nvidia GPU, or thinking of buying a Voodoo5 5500 or a Radeon.

    ---Thanx, Sharky, for the shootout---

    For the flame makers!
    I use 2 cards on my 2 systems:
    1 - Voodoo3 3000 AGP
    2 - 3D Blaster Annihilator Pro DDR 32.
    And I'm happy, I love the Voodoo3 and the GeForce :P

    hahahahahahahah


    By the_writer921 October 10, 2000, 05:36 AM

First, I gotta say: I knew this was going to happen.
    Second, I have to thank YMaster and Rye for actually taking the time to prove your point, making your thoughts very clear. (((And after looking at this, my rant was longer than I thought it would be.)))
    Next, I have to thank Alex and the Sharky team for writing the article.
    I have said this before, and I am saying it again...
    These guys work their asses off for us. They stay late, they do hours of testing, then write reviews, all to be cursed out by people, some of whom DIDN'T EVEN READ THE ARTICLE (for example, the 64MB cards, and the V5 5500, which is 32MB per processor). Hell, as usual, most of the people who come here post a negative message. They rarely even get a THANK YOU for all the effort they put in. Instead, they get accused of being slanted, bribed, and unfair (just because your card didn't win).
    You didn't pay to read that article, and you didn't pay to post those messages. If you didn't like the article, you could have stopped reading. Instead, you decided to keep on reading, knowing that no matter what the article said, they were wrong and, of course, you are right.
    You didn't have an open mind about what they said. I have no problem with you disagreeing with the results if you keep an open mind. Instead, you come in here pissed that your card, which REALLY is the fastest (even though 500 benchmarks could show otherwise), didn't win.
    In most OpenGL games, I would say a GTS would be fastest. For image quality, the Radeon is probably best. For older games or driving games, a V5 5500 is probably best. If you disagree with me, that's fine. But if you disagree, I would hope you would at least be respectful enough to show me why I am wrong, not to tell me I am a liar and swear in my face.
    I know that there are days when some of the Sharky team don't leave until 2, 3, even 4 or 4:30 AM. They sleep a few hours, then come back and do it again. They work hard doing these reviews and articles. They are dedicated to doing a good job, and all you can do is find faults. If anything, you should be thanking them for all their work, NOT accusing them. Like I said, show them the reasons why they are wrong, but you don't need to insult them.
    Finally, remember that an insult is probably the lowest way of trying to make yourself feel above someone else. An insult on a forum is even lower.
    Alex and crew: thank you again for a great article, keep up the great work, and remember that at least most of us are extremely thankful for all the work you guys put in.
    GT and the others who don't like the articles, and those who can only flame: if you don't like the site, no one ordered you to come here. No one forced you to read the article. You don't have to read anything they say; since you don't have an open mind, they must already be wrong, even before you read the article. If you don't like it so much, then don't read it.
    Anyway, enough ranting. Thanks again, Alex and all the Sharky crew.
    -the_writer921

    By Ben October 10, 2000, 06:10 AM

I don't know why we're being blackballed for this. All we did was bring up a valid concern. The Voodoo5 and Radeon were at a statistical disadvantage from the very beginning. I still don't understand the reasoning behind the inclusion of six GeForce2 cards when two would have sufficed. The review would have been fine if there had not been 1st, 2nd, and 3rd place winners, which are deceiving results in and of themselves. In order to promote fair competition, an equal number of each chipset should have been included. Since ATI and 3dfx manufacture their own cards, only one GeForce2 should have been chosen to compete, IMO.

    And on another note, you cannot tell us to leave if we don't agree with an article. And the service is NOT free. When the advertisements are removed from the site, then and only then would it be "free." We pay by visiting the site and clicking through a 20-page review. The fact of the matter is that many people find this review to be lacking in certain key areas. This is not an attack on SharkyExtreme, but rather a criticism of what many feel to be an unfair review.

    By Ben October 10, 2000, 06:13 AM

BTW, I love the "shut the hell up" line. Seems pretty narrow-minded to me.

    By alexross October 10, 2000, 06:22 AM

    Thanks MafiaBoss (hey you a fan of The Sopranos then?),

I am glad you saw the spirit of the article and found it useful. Keep it simple and to the point, etc. ... that was the motto.

    quote:Originally posted by MafiaBoss:
Well, I am wondering why some people are angry and throwing words around here.
    I will speak as a consumer, not as a fan. Sharky did the right thing. You ask why?
    First, now we know the software bundle that comes with each card.
    Second, we now know which card is right for us and our needs.
    And about having six cards using the same GPU, and why they did this?
    It reminds me of something: why do PC Mag and some others test
    different computer systems with the same CPUs?
    My point is, don't make it difficult to judge; just choose the one you like.
    And that, in my opinion, is what Sharky did: they showed us different cards.
    We cannot go out and buy all the cards, test them, and then pick one!
    By the way, now it's easy for anyone trying to buy a new card based on
    an nVidia GPU, or thinking of buying a Voodoo5 5500 or a Radeon.

---Thank you, Sharky, for the shootout---

For the flame makers!
    I use two cards on my two systems:
    1. Voodoo3 3000 AGP
    2. 3D Blaster Annihilator Pro DDR 32
    And I'm happy. I love the Voodoo3 and the GeForce :P

    hahahahahahahah

    By alexross October 10, 2000, 06:24 AM

    Heya Bloodred,

The site was crippled yesterday; there seemed to be a software problem and an overload of users. I think that's been fixed, so give it a try today. We've added a bunch of stuff based on the feedback we got here and in email on the first day.

    quote:Originally posted by BloodRed:
    First off, I haven't read the article because I can't get to SharkyExtreme here at work. Damn proxies! Anyway....

    Maybe he felt he should compare ALL the products on the market. It's not Sharky's fault that nVidia has released three cards in the time it's taken their competition to release one. This sounds like another, "Waaaa! My video card didn't win..." rant. I'm sure that you would have praised the review if there had been three ATI cards in it.

    By alexross October 10, 2000, 06:28 AM

Well, the title/subject you wrote certainly got attention.

    But you are right: buying a video card is hard and should be a personal decision. At SE we're here to give you guys free advice and, of course, our experience with these products. That's all, really... it's not rocket science.

    The best point you make is this: "Numbers may mean more to certain people, whereas image quality could mean the world to someone else." That's exactly my feeling and what we wrote. That's why we had different categories like DVD, bundle, image quality, FSAA, etc.

So, all in all, I think it sounds like we agree on all counts.
    quote:Originally posted by Just:
    Buying video cards should be a personal decision. Sure, you can get ADVICE from friends and experts such as those who roam this board, but it boils down to personal preference. Numbers may mean more to certain people, whereas image quality could mean the world to someone else.

    I hope those who read Sharky's shootout will take into consideration that benchmarks aren't everything. Sure, he had more GeForce chipsets in there than other boards; he's only giving you guys an opportunity to see what they can do. It's free information, people! Although I don't quite have the budget to buy a high-end card right now, if I did have the money, I would certainly have appreciated what Sharky did.

    No one says you gotta buy nVidia just 'cause they got a better review or whatever. If you like ATI or 3dfx more, then why even look at those "shoot-outs"? Your minds are already made up!

    By alexross October 10, 2000, 06:29 AM

    quote:Originally posted by Wotan81:
    I totally agree with GTaudiophile!!!

Sharky swims in murky nVidia waters! How much did they pay you?

The same as every other hardware website and PC mag: they send an eval board out with an embargo date. Amazing what some companies will do to get some coverage, eh?

    By alexross October 10, 2000, 06:37 AM

    Heya Colonel (love the land of Oz btw),

Thanks for the kind words, and an interesting analogy there. I see exactly what you mean. It's something I have come to accept; people do get emotional about their 3D hardware and defensive in a similar way to someone slagging off their car, stereo, TV, etc. I guess they've become THAT important. I know from being an early technology adopter myself that I have made some good purchases and some crap ones in my time.

    That is how I got into this... I still remember in November 1996 searching high and low for both a PowerVR gen1 card and a Voodoo Graphics card. I liked the technology so much I had to figure out a way of using both at the same time... That's how I started my first page, and even then... it caused controversy. All I wanted to do was show people that you could use both at the same time. I wasn't a 3dfx fan or a PowerVR fan. I liked both cards and used both, which back then was like saying you're left wing and right wing at the same time. A no-no... Times haven't changed much. I've learned a lot since then, though.

    Glad you liked the article.

    quote:Originally posted by colonel:
In Australia we have a big car rivalry between Ford (who make the Falcon) and Holden (who make the Commodore, or the Lumina as you Americans know it). Our equivalent of NASCAR is touring cars, and the Falcons and Commodores race each other. The biggest race of the season is the Bathurst 1000, and everyone watches it. The next day the winner gloats and the loser makes up stupid excuses about how it was rigged and how one or the other is still better.
    Thanks, Sharky, for doing exactly what I asked you to do in a previous post. It's just a shame that people can't take it when their card loses. What's sad is that we're talking about $250 video cards here, not $30,000 cars, and I find the car arguments sad.

    By alexross October 10, 2000, 06:44 AM

Ben, if I could please ask you to read some of the posts I have made; I think they answer some of your very valid questions. The point of the article wasn't to say whose technology/chip is better, but more from the standpoint of "I have 300 bucks to spend and I'm going to Fry's. OMG, I just got here and they have a wall of different products; what do I do?".

    That is why we included those products. We do review technology, etc., and that's something we'll continue to do. We looked at very different categories, and NOT just performance and/or visual quality. This isn't the kind of article to stir up whose technology is better. That's an important point, yes. And in the spirit of the article, I wrote that nobody that owns or is planning to own one of the cards in the race should feel bad. It's like getting to the Olympic 100M final or something. As for the 64MB card from ATI, that's going straight into our Super-High End round-up. It was too pricey, like the 64MB GeForce2's and the Voodoo5 6000, which isn't out yet (nor is the GF2 Ultra).

    This was a product round-up, and I know it wasn't perfect, but I can tell you we did our best, and I'm proud of the gang for all those late nights last week (had to get that in there)...

    quote:Originally posted by Ben:
I don't know why we're being blackballed for this. All we did was bring up a valid concern. The Voodoo5 and Radeon were at a statistical disadvantage from the very beginning. I still don't understand the reasoning behind the inclusion of six GeForce2 cards when two would have sufficed. The review would have been fine if there had not been 1st, 2nd, and 3rd place winners, which are deceiving results in and of themselves. In order to promote fair competition, an equal number of each chipset should have been included. Since ATI and 3dfx manufacture their own cards, only one GeForce2 should have been chosen to compete, IMO.

    And on another note, you cannot tell us to leave if we don't agree with an article. And the service is NOT free. When the advertisements are removed from the site, then and only then would it be "free." We pay by visiting the site and clicking through a 20-page review. The fact of the matter is that many people find this review to be lacking in certain key areas. This is not an attack on SharkyExtreme, but rather a criticism of what many feel to be an unfair review.

    By Althor October 10, 2000, 07:14 AM

    quote:Originally posted by alexross:
    Heya Colonel (love the land of Oz btw),

    "It's something I have come to except, people do get emotional about their 3D hardware and defensive in a similar way to someone slagging off their car, stereo, tv etc... I guess they've become THAT important. .....I wasn't a 3dfx fan or a PowerVR fan. I liked both cards and used both, which back then was like saying you're left wing and right wing at the same time. A no no... Times haven't changed much I've learned a lot since then though."

    Glad you liked the article.


I think that's what it's all about. It seems that people take some things too personally, although we're just talking about video cards, and not politics, religion, or other "serious" stuff.
    And when it comes to finding out which video card is the best for you, you're on the right site here at Sharky. But Sharky doesn't make the decisions for you guys!
    The reviews are just an indicator of which card does what, and how much.
    If you ask me, I'd like to have a V5, a Radeon, and a GTS, because they're all good cards. But since I'm not a millionaire, I gotta make a decision (I bought the GTS and I'm happy with it, which doesn't mean the other video cards are bad, but the GTS fits my needs best).
    That's what the guys from Sharky do when they test hardware: make a decision about which part they would prefer if they were you. And they do that job pretty well, and they actually helped me make my decision.

    By Ben October 10, 2000, 07:35 AM

    Sharky,
Thank you for the detailed responses. I respect the fact that you admit the review was not perfect, and I wanted to let you know that I appreciate the work that goes into a hardware site such as this.

    Would you mind addressing the "image quality" score in the round-up? I was under the impression that the Radeon had better image quality across the board (no pun intended), but in the review it receives the same score (9) as the GeForce2's. Thanks,

    Ben

    By FreakyG October 10, 2000, 10:06 AM

TO EVERYONE WHO THINKS SHARKY IS UNFAIR TO THE RADEON, PLEASE READ THIS POST THAT HE MADE TO ME, RESPONDING TO MY COMMENTS.

    His responses are in bold.

    ____________________________________________

    My original post.........

    "You have to remember that this was only the 32MB Radeon, not the 64MB version. There is a greater difference between the 32MB and 64MB Radeon than the 32MB and 64MB GeForce2."

    "Good point and something that you'll see us cover in the next one soon."

    "In fact, the extra memory really pays off with the 64MB Radeon performance, but the jump from 32MB to 64MB on the GeForce2 shows very little improvement."

    "Yeps. That is true. There is some benefit when using games with lots of textures etc...but not much."

    "In 32bit the Radeon with the very first drivers at launch WAS INDEED beating the GeForce2 that had already had many driver updates, so the Radeon clearly started out more powerful than the GeForce2. Nobody is disputing that.

    However, the GeForce2 cards have had yet another driver update (Det3's), and have caught up, and even squeaked ahead by a small margin."

    "It's small and depending on who you ask it ranges from minute to significant."

    "But this is actually quite a compliment for the Radeon when you consider that it's still very young, and it's taken the GeForce2 cards MANY driver updates to equal what the Radeon was doing right out of the box when it launched.

    You see what I mean?

Remember, the Radeon is only beginning, and has not even stretched its legs yet."

    "I think that is a fair comment."

    "I admit the "Sharkman" seems biased towards Nvidia sometimes, however I don't think this round-up is one of those times."

    "Thanks for the kind words but the site is neither biased not slanted towards NVIDIA. We tell it like it is and the benchmarks do not lie. I care not for company politics etc... and am all about letting you folks know what we feel is the best hardware in town. Simple as that."

    "I like Sharky, but I also think he has been a little hesistant to give the Radeon the credit it deserves. *ahem* You are working on that aren't you Sharky?

    "Actually I like the Radeon a lot. It REALLY suprised me when I tested it. I think it suprised a lot of people. And for 150 bucks.. well as I said in the article, it's maybe the best $150 you'll ever spend on an upgrade."

    "In any event, the Radeon is an awesome card, and in most people's opinion, the best all-around card available."

    "As far as the all-round cards go.. their All-in-Wonder certainly holds that title and depending upon what type of performance you're looking for and the price you are willing to pay for it.. well you are right. If you set a limit of 150 bucks you can do no better and are in the same league as a GeForce 2. The DVD is slightly superior too even if it is slightly slower than a GF2 in Quake 3, which is important to many people."

    "Just remember, you are seeing the results of MANY Nvidia driver updates and tweaks, and several months more of maturity than the Radeon has had yet."

    "Right"

    "When the Radeon has an equal time to mature, it will only get better too. Not to mention all the dirextX-8 features that will give it an advantage in the newer games on the horizon. And of course, the excellent image quality, and other special features."

    "DX8 and whenever it actually comes out will cause a real shake up in the 3D graphics world. Only the strongest will survive etc..."

    "Personally, I think that if you are lucky enough to have either a Radeon OR GeForce2, you should feel happy-happy-happy. They are both great cards, and will last a long time."

    "That is EXACTLY my point and I said it at least 3 times over in the article. I know people can get emotional about 3D cards and so that is why I said things like "If your card made it into this round-up consider it like the 100M Olympic final. You're good. Very good etc."

    "With the ATI and Nvidia cards both being so great, perhaps the real question should be...."What the hell is 3dfx thinking, trying to sell their card at a higher price than Radeon OR GeForce2, when it performs significantly less than either of them?"

    "The price of the Voodoo5 was taken into account and points docked."
    _____________________________________________


There you have it. Does that sound like a guy who is trying to be unfair to the Radeon? I think NOT!

    Sharky really likes the Radeon cards and has a lot of good stuff to say about them. He gives credit where credit is due, and even agrees that the Radeon is just getting warmed up and has a very bright future indeed.

    Remember, people: the GeForce2 has had a lot more time to mature than the Radeon. But look at how strong the Radeon started. Just imagine when it matures too!

    Listen, I am the proud owner of a 64MB Radeon, and I think Radeons are the greatest thing since sliced bread. So if the review doesn't offend me, it certainly shouldn't offend anybody else.

    I think the Sharky-bashing is out of hand. Just read his comments on my post and tell me exactly where you see him being unfair.

    Also... if I prefer Porsches to Ferraris, does that mean Ferraris suck? Hell no, it's a Ferrari, for Pete's sake.

    And vice versa.

    If you have, or will soon have, a Radeon OR a GeForce2 in any flavor, you should consider yourself lucky. Damn lucky.

    I don't post very often, and Sharky certainly doesn't need me to defend him, but I think some of you guys are being unfair. This was worth breaking my two-posts-a-month rule. Sharky is a good guy.

    Now go play some of the games that you bought your card for to begin with.

    Have some fun .................DAMNIT !


    By Robo October 10, 2000, 10:29 AM

1) I find it interesting that you declared the 5500 "a dog".

    2) I find it interesting that you conveniently skipped any Unreal Tournament or Deus Ex gaming comparisons. God forbid you do that; the 5500 owns in those games.

    3) I find it interesting that the only D3D game you tested was an old DX6 game that CONVENIENTLY exhibits an issue with the 1.03 drivers, so you had to go back to 1.01, which CONVENIENTLY exhibits an issue at 1600x1200 in D3D.

    4) I find it interesting that you neglected to compare framerates in any racing or flight sims, other than an old-ass DX6 game that is CPU-limited up to 1600x1200x32. Yeah, that'll show a big difference.

    5) I find it interesting that you could've used FRAPS with NFS:PU, or you could've used NASCAR Heat (built-in timedemo), or you could've used Evolva. All of these games are DX7-capable, and Evolva even has HW T&L support. Hey!!! That should show just how superior T&L is (hahaha...). Yet you chose an archaic game (one of the only ones) that has a compatibility problem with the 3dfx card.

    Hey, if Re-Volt wasn't old enough, why not try Tomb Raider??? WTF, let's go all the way back to Doom!

    Your anti-3dfx bias is striking. Not sure what happened; perhaps Brian Burke screwed your girlfriend or something, but your slams are so transparent, and your bias so strong, that you are becoming a joke.

    This used to be a damn good hardware site. Emphasis on *used*, i.e. past tense.

    By Adisharr October 10, 2000, 11:51 AM

    quote:Originally posted by toonzwile:

You guys are a bunch of dumb***es... nVidia has the most products out on the market right now, and each designer brings new things to their product... so you get different varieties. If ATI didn't have the $$ foundation, I'm sure there would be a bunch of Radeon cards floating around, and the same goes for 3dfx. ATI and 3dfx are both larger companies than nVidia, and have chosen to produce their own products without outside manufacturers. nVidia didn't choose that path. Six months from now, if ATI holds to their roadmap, there will be six new video cards from ATI, hopefully an NV20 product, and -- this is by far the LONG SHOT -- a Rampage product from 3dfx. When Sharky decides to do a review and includes six Radeon cards with one each from nVidia and 3dfx, then WTF are you idiots gonna say?!

    * Sorry for the language, Sharky.


You are incorrect in your analysis. All the nVidia cards are using basically the SAME CHIP AND DESIGN. You can tell just by looking at the performance benchmarks. It would be just as pointless to review six ATI or 3dfx cards if they all used the same chip in the same configuration.

    If you haven't been around: generally, if multiple manufacturers are using the SAME chip in the same configuration with the SAME speed memory, the performance will be about the SAME. Makes sense? I hope so.

    $.02

    Oh, and if they had used an MX, a GF2, and a GF in the review, THAT would be using different cards -- not six clones. If you're gonna call people dumbasses, at least know WTF you're talking about.

    By Whirlwind October 10, 2000, 12:51 PM

I don't know if the following has been addressed yet, so I'll ask anyway:

1) Where did you get the reference image, and isn't the reference image automatically correct for the card it was originally taken from?
    2) Benchmark setup: did you run the benches in DOS with 4MB of RAM, using the beta drivers for the cards you didn't want to win? Also, what BIOS settings did you use (AGP aperture, memory settings)? What tweaking did you do on each card? What motherboard did you use? Were any of the cards overclocked? Why did you use eight or so reference-design GeForce2 GTS cards? How do you know the cards were running at their default core speeds? Did you run each card on the same install of the OS, or did you wipe and reinstall the lot each time -- in other words, how clean was the OS you installed the cards on? How many times was each test run?
    3) I was going to attempt to compare the new results with the old results using the initial reviews of the cards. I can't, because you used 1GHz CPUs (which only about 0.1% of gamers use) here, and anywhere from 700MHz to 800MHz for the initial card reviews. Is there a reason you ran them on 1GHz CPUs?
    4) Do you really think that your 'comparison' will be taken seriously when you are so blatantly biased against anything not nVidia? I am not biased against any card (I have an SLI rig of V2s, a GeForce2 MX on order, and a pair of ATIs running until the GeForce2 MX comes in). I buy the card that fits my bill. Your review is too 'black box' to be given credit, and you also fail to note that, in reality, a 0.1 FPS spread between the GTS cards makes having a variety of them pointless, as does giving them their own section, when a simple chart with software bundles, prices, and manufacturers would have sufficed.
    5) How much gratuity is nVidia floating your way in terms of financing, free products, and having their guy come by and tweak your machines for the review -- or, in layman's terms, stacking the deck with free technical expertise?
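The "0.1 FPS spread" point above is the right instinct: with several boards built on the same chip, the honest question is whether any gap between them exceeds run-to-run noise, which is why the question of how many times each test was run matters. A minimal sketch of that check, using hypothetical timedemo numbers (not figures from the review):

```python
import statistics

def summarize_runs(fps_runs):
    """Mean and sample standard deviation over repeated benchmark runs."""
    mean = statistics.mean(fps_runs)
    stdev = statistics.stdev(fps_runs) if len(fps_runs) > 1 else 0.0
    return round(mean, 1), round(stdev, 1)

def meaningful_difference(runs_a, runs_b):
    """Treat a gap as real only if it exceeds the combined run-to-run noise."""
    mean_a, sd_a = summarize_runs(runs_a)
    mean_b, sd_b = summarize_runs(runs_b)
    return abs(mean_a - mean_b) > (sd_a + sd_b)

# Hypothetical Quake 3 timedemo results, three runs per card:
gts_runs = [98.4, 99.1, 98.7]
radeon_runs = [98.5, 98.9, 98.6]

print(summarize_runs(gts_runs))                      # → (98.7, 0.4)
print(meaningful_difference(gts_runs, radeon_runs))  # → False: inside the noise
```

On numbers like these, a 0.1 FPS gap between two boards with the same chip says nothing at all, which is the poster's point about listing six near-identical cards separately.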

    By toonzwile October 10, 2000, 01:08 PM

    quote:Originally posted by Adisharr:

You are incorrect in your analysis. ... If you're gonna call people dumbasses, at least know WTF you're talking about.

Boy, what a difference a day makes. OKAY!! So I ranted yesterday and started calling people names and whatnot. I just felt the comments were unjustified, but my argument, while the intention was good and the thought was clear, didn't come out the way I wanted it to. So I was wrong on just about everything. Thanks to bclothie & Adisharr for pointing this out and letting everyone know just how wrong I was.

However, it was still wrong of you people to rant on about GF2s and why your Radeons or V5s were at a disadvantage. UT for a benchmark? Well, I'm sure somewhere on these forums there's an explanation as to why UT isn't used as a benchmark (something about all platforms getting poor performance regardless)... so while UT might fly with Glide (which is less GPU/CPU intensive than OpenGL, upon which it's based) and all other cards are forced to use the aforementioned complicated OpenGL, there's a certain disadvantage to the Radeon and GF2s there.

Radeons shine at 32bpp settings? Yes, they do. And yes, the Radeon is still in its infancy and can only get better. However, you should all know that they always do 32-bit testing on these sites (Sharky's, Thresh's, etc.). You're comparing chipsets here, and that's not what the review was about. The GF2 has been around for how long? And how many driver updates and core revisions has it gone through? The Radeon, like I just said, is still young. You get ticked off when it doesn't beat a mature chipset, because you know that with time it will beat the GF2. I know that too. But that time hasn't come yet... give it a few months. Wait till DX8 comes out... whatever. But stop whining because the card's full potential hasn't been realized yet, and so it loses to a chipset that you're probably just tired of seeing take the crown. This is the video card market! Things are never etched in stone. 3dfx had the crown for a long time... now nVidia has the crown, and if ATI keeps to its roadmap, it seems they will probably have the crown soon.

Sorry if it seems like I was putting down the Radeon and the V5. I was extremely surprised when the Radeon was unveiled. I give props to ATI for bringing a true competitor to the scene, and every day it seems more and more like ATI's competitor will become the leader. And maybe I am a bit biased against the V5, but just about everyone should admit that 3dfx has some problems. They relied on their wonderful Glide support... that always gave them an edge over other video chipset manufacturers. Well, times are changing. Glide is starting to fade away... video chipsets are getting powerful enough to use true OpenGL instead of a miniGL driver. All video chipset manufacturers are pushing their own 'next-gen' technologies and features... T&L, which hasn't shown its full potential yet... FSAA, which is currently showing its potential... I don't want to upset any ATI fans, so I'm not going to say anything about their Charisma Engine, because I don't know anything about it other than that it supports a lot of DX8 features.

Going back to 3dfx... their T-Buffer technology, FSAA aside, still isn't much more than paperware, with a couple of screenshots here and there, maybe a demo of the possibilities. But maybe this is where my bias becomes apparent: four VSA processors SLI'd, each with its own 32MB buffer (which is deceiving to the user who thinks he's getting a 128MB buffer), so power-hungry they require a separate power supply, and the end result is a card that makes some old VL-Bus video cards look short in comparison. If you think about just the performance hit that 3dfx is taking with 4x FSAA, you can imagine what kind of performance hit even the V5 6000 is going to take if you enable 4x FSAA, motion blur, depth of field, soft shadows, and soft reflections! IMO, they're offering an unrealistic view of what a game could be like. There's no way the game would be playable (at least 30 fps) at a resolution of at least 1024x768x32 with all these features enabled... not unless you have a system running a couple of GHz, with 256MB of AGP texturing set in the BIOS -- oh, I forgot! 3dfx doesn't do AGP texturing -- well, you see where this is going. I give props to 3dfx for their FSAA; it's truly the best visual quality right now... but then there's the very crisp and sharp visual quality of the Radeon... and what happens if ATI's developers decide to tune their drivers to offer better FSAA performance? Then which one will be better at FSAA, 3dfx or ATI?
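The 4x FSAA performance hit described above is easy to ballpark: supersampling renders every output pixel four times, so a purely fill-rate-limited frame rate drops by roughly a factor of four before any of the other T-Buffer effects are even enabled. A rough illustration (the fill-rate and overdraw figures are hypothetical, not measurements of any card in this thread):

```python
def fill_limited_fps(fill_rate_mpixels, width, height, samples=1, overdraw=3.0):
    """Rough upper bound on frame rate when pixel fill is the only bottleneck.

    samples=4 models 4x supersampled FSAA: each output pixel is rendered
    four times, so the per-frame pixel cost quadruples.
    """
    pixels_per_frame = width * height * samples * overdraw
    return (fill_rate_mpixels * 1_000_000) / pixels_per_frame

# Hypothetical 1000 Mpixel/s card at 1024x768:
base = fill_limited_fps(1000, 1024, 768)             # no FSAA
fsaa = fill_limited_fps(1000, 1024, 768, samples=4)  # 4x FSAA
print(round(base), round(fsaa))  # → 424 106
```

Real cards are also limited by memory bandwidth and CPU speed, so the measured hit is usually somewhat less than the full 4x, but the arithmetic shows why stacking 4x FSAA with motion blur and soft shadows at 1024x768x32 was implausible on 2000-era hardware.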

For every argument you can make about any chipset and its features, there's something else the other chipsets have that will always be better... this is progress. You don't know what the future will hold, so there's no point having all these arguments about your video card(s) and how much they kick ass today... because tomorrow, today will be yesterday... and something better will come along.

Well, since I've quit ranting, have gone from apologizing for my outburst yesterday, to explaining what I think about the different complaints, to showing my thoughts on today's newer products, back to bashing 3dfx, and then finally speculating on the future, I think I will end this. I think I'm going crazy.

    By joeyd October 10, 2000, 01:24 PM

If this was a shootout between 32MB cards, where is the MX? It's at the same price point as the 32MB DDR Radeon. How about the Voodoo4, or the Vivid? And why the heck was it so long... sheesh, it was like a novel... good thing, though, it gave me something to do during RPG.

    By GTaudiophile October 10, 2000, 01:26 PM

    I knew this was going to be a long thread...

Sharky said this article was written for the person who walks into Fry's with $300 and has to choose between all the available video cards. I have to ask: in a world that can shop on the Internet, when is that ever the case?

I still think the nVidia video cards should have been reviewed separately, and the winner of that group pitted against the V5 and Radeon.

Can anyone think of an Olympic race that included six Americans, one German, and one Australian? Well, I can't. I still think, despite all the posts here, that the "shootout" was more of a bombing with an nVidia slant to it. I can hear nVidia's marching orders now: "Sharky, our bottom line is being hit because of the Radeon. Actually, sales of the Voodoo5 are quite strong as well. We want you all to write an article that will put us on top in the eyes of all your readers."

    I will no longer give a shred of trust to any Sharky review.

    BTW, I've NEVER owned an ATI videocard. My first setup included an STB Velocity 128zx with an STB Blackmagic 12MB Voodoo2. My current setup has a TNT2-Ultra.

    By 100%TotallyNude October 10, 2000, 01:43 PM

Opinions are like you know what... here's mine:

    quote:Originally posted by GTaudiophile:
    Sharky said this article was written for the person who walks into Frys with $300 and has to choose between all the available videocards. I have to ask, in a world that can shop on the Internet, when is that not the case?

    What difference does it make?

    I still think the nVidia videocards should have been reviewed separately, and the winner of that group pitted against the V5 and Radeon.

As has been previously stated: why review a Nissan and a Mitsubishi separately when they both have the same 4-cylinder Korean engine in them?

    Can anyone think of an Olympics race that included 6 Americans, 1 German, and 1 Australian? Well, I can't.

You're kidding, right? What in the world does that statement mean? If you want to make a comparison to the 6 Americans wearing Nikes, the German wearing a rocket pack, and the Australian going barefoot, you might be making some kind of a point there.

    I still think, despite all the posts here, that the "shootout" was more of a bombing with an nVidia slant to it. I can hear nVidia's marching orders now: "Sharky, our bottom line is being hit because of the Radeon. Actually, sales of the Voodoo5 are quite strong as well. We want you all to write an article that will put us on top in the eyes of all your readers."

    Look, you're high. I've read the review too, and all I can say is that if anything, the SE staff have bent over backwards to be even tempered about the whole process.

    I will no longer give a shred of trust to any Sharky review.

Hey, that's your option. And you paid what for the review? And the review cost how much in blood, sweat, tears, and money? You can have a strong disagreement with anyone; I don't care about that. I just want to understand YOURS to some extent.

    By Bateluer October 10, 2000, 03:26 PM

    quote:Originally posted by GTaudiophile:
    Ok, today children, we're going to take 1 3Dfx card, 1 ATI card, and 6 FREAKING NVIDIA CARDS, and see which three are the best overall. Of course children, since we're using 6 cards based on a GPU from ONE company, the law of averages states that the best three will naturally be from THAT company. So, children, instead of giving the bronze to 3Dfx, the silver to ATI, and the gold to nVidia, we're going to blow sunshine up the ass of nVidia by giving them ALL THREE freaking awards. Oh, and by the way children, our next lesson will be about how to cash checks from nVidia!

    WHAT A BUNCH OF BS, SHARKY!!! Do you actually think we're THAT dumb here??? We are not blind! Just because the GF2-Ultra was delayed and ATI announced a roadmap that will destroy nVidia's crown doesn't mean you all should try pulling the blanket over our eyes! I MEAN COME ON!!! The Radeon 32DDR bests the GF2 cards in one benchmark, stays within 4 FPS in the other benchmarks, has arguably the best 3D image quality, has the easily the best DVD playback ability, and COSTS $100 LESS THAN the competition!!! I mean, WTF???

The ATI Radeon 32DDR is CLEARLY the best videocard, and just because of your damned biased article, I will buy an ATI Radeon INSTEAD of a GF2MX. So shove that!!!


Sharky used 6 Nvidia-based cards because Nvidia sells its chips to other companies, which build cards based on them. Both 3dfx and ATi keep their chips in-house, and so there is only ONE type of Radeon chip and ONE V5 board. ATi did change the speed of the RAM on their boards, but that hardly requires that they make serious changes to the 3D chip itself. 3dfx simply adds more VSA-100 chips on a single board.

    By alexross October 10, 2000, 04:05 PM

Thanks, Althor,

I could not have said it better myself. Really. You get the whole point of the site and I love that. It's a resource, and I am never going to let it turn into "you must do this or buy that". Then I would have to change the name to AMD Extreme, Intel Extreme and/or 3dfx Extreme. It's objective, but it just shows our experiences with products and technology. It's always up to the reader to decide what they buy; I'd never want to take that away, and I'm not arrogant or stupid enough to think that our site ever could. This forum clearly shows that the readers are all very knowledgeable and are quite capable of making these decisions for themselves.

quote:Originally posted by Althor:
quote:Originally posted by alexross:
Heya Colonel (love the land of Oz btw),

    "It's something I have come to except, people do get emotional about their 3D hardware and defensive in a similar way to someone slagging off their car, stereo, tv etc... I guess they've become THAT important. .....I wasn't a 3dfx fan or a PowerVR fan. I liked both cards and used both, which back then was like saying you're left wing and right wing at the same time. A no no... Times haven't changed much I've learned a lot since then though."

    Glad you liked the article.


I think that's what it's all about. It seems that people take some things too personally, although we're just talking about videocards, and not politics, religion or other "serious" stuff.
And when it comes to finding out which videocard is the best for you, you're on the right site here at Sharky. But Sharky doesn't make the decisions for you guys!
The reviews are just an indicator of which card does what, 'n how much.
If you ask me, I'd like to have a V5, a Radeon and a GTS, 'cause they're all good cards. But since I'm not a millionaire, I gotta make a decision (I bought the GTS and I'm happy with it, which doesn't mean the other vidcards are bad, but the GTS fits my needs best).
That's what the guys from Sharky do when they test hardware: make a decision about which part they would prefer, if they were you. And they do that job pretty well, and actually helped me make my decision. If we can give you our experiences and that helps any of you, then that's just great.



    By alexross October 10, 2000, 04:13 PM

    Hi ben,

Thanks for taking the time to read my response and for seeing where I am coming from. In terms of the image quality, I'll talk to Chris about it (that was his section), and even though the shots didn't show it... the image quality scores weren't wrong, I believe. In theory there are some better techniques used by the Radeon, but in things like 3DMark, and/or to the general user, it doesn't show up. Where there's a dramatic effect is in things like FSAA. That's instantly noticeable. If we were to pinpoint and zoom in on some images, yes, you are correct, there are places where the Radeon looks better. My point, though, is that nobody apart from the most technical person would buy a Radeon and find this out and actually use it like this. The image quality score was based on some very simple criteria, things like 32-bit color etc..

Either way, I may not think the score is wrong, but I will talk with the guys about it and bring up this post. Perhaps next time around we can be a lot more stringent on the image quality score, and/or accept that if there are more people like yourself who do zoom in, put the technology under the microscope and look at white papers, well then yes, the Radeon does do certain things cleaner if not necessarily faster (another trade-off).

    Thanks for the kind words and the appreciation of the work we do.

    quote:Originally posted by Ben:
    Sharky,
    Thank you for the detailed responses. I respect the fact that you admit the review was not perfect, and wanted to let you know that I appreciate the work that goes into a hardware site such as this.

Would you mind addressing the "image quality" score in the roundup? I was under the impression that the Radeon had better image quality across the board (no pun intended), but in the review it receives the same score (9) as the GeForce2's. Thanks,

    Ben

    By alexross October 10, 2000, 04:32 PM

    Originally posted by Whirlwind:
    I don't know if the following has been addressed yet, so I'll ask them anyways:

    1) Where did you get the reference image, and isn't the reference image correct automatically for the card it was originally taken from?

>Incorrect. It's one that MadOnion uses by default, and it is rendered in software.

2) Benchmark setup - did you run the benches in DOS with 4MB RAM, using the beta drivers for the cards you wanted to handicap?

    >As listed all the drivers were the latest downloadable and available ones on the web.

Also, what BIOS settings did you use (AGP aperture, memory settings)? What tweaking did you do on each card?

>No tweaking whatsoever, except when drivers were FORCING 16-bit textures and we were trying to benchmark in 32-bit. I had to override those settings to get a decent score.

    What motherboard did you use? Were any of the cards O/C'd?

>We used two different motherboards (check the system specs and this will answer your question). No cards were overclocked, and we pointed that out. Everything was run at the correct speed.

Why did you use 8 or so reference-design GeForce2 GTS's? How do you know the cards were running at their default core speeds?

    >Because those are the most popular products around. If 3dfx had 5 different Voodoo5's on sale I would have done the same thing. In fact I used to do that when there were 57000 Voodoo2's before they started making their own cards with STB.

    Did you run each card on the same install of the OS or did you wipe and reinstall the lot each time - in other words, how clean was the OS you installed the cards on? How many times were each test ran?

>Very clean. We use DISK IMAGES from Norton Ghost. The only and best way to do things. The OS was clean that way EVERY single time, for every single card. It takes a while to do this, but that way you get the best results.

3) I was going to attempt to compare the new results with the old results for the cards, using the initial reviews of the cards. I can't; you used 1GHz CPUs (which only about 0.1% of gamers use) here, and anywhere from 700MHz-800MHz for the initial card reviews. Is there a reason you ran them on 1GHz CPUs?

>A good point. I wish we had the bandwidth to do more of these tests on all sorts of CPUs. I really do. Perhaps in time, when we grow, we can cover more tests. For the meantime, we do have a couple of articles on the way with lower-end CPUs. All of our Voodoo4's, GeForce MX's etc. are run on our ValuePCs.

4) Do you really think that your 'comparison' will be taken seriously when you are so blatantly biased against anything not nVidia?

>I am confident it will. It's not meant to be taken "seriously". It's not rocket science; it's just some benchmarks, some images, and a large readership that decides for themselves.
I am not biased against any card (I have an SLI rig of V2's, a GeForce2 MX on order, and a pair of ATI's running until the GeForce MX comes in). I buy the card that fits my bill.

    >Then you are also not exactly representative of the "0.1% of gamers" as you put it and hopefully you can now appreciate why we used the 1GHz processors. We wanted to push the cards. Anyway you must be in that 0.1% of gamers and thus I see where you are coming from.

Your review is too 'black box' to be given credit, and you also fail to note that in reality, a 0.1 FPS spread between the GTS cards makes having a variety of them pointless, as does giving them their own section, when a simple chart with software bundles, prices, and manufacturers would have sufficed.

>So you're saying that I should just write a round-up with no benchmarks at all? Basically the performance, which was worth 20 out of the overall 100 points, is useless in your opinion?

5) How much gratuity is nVidia floating your way in terms of financing, free products, and having the guy come by and tweak your machines for the review - or in layman's terms, stacking the deck with free technical expertise?

>None, none and none whatsoever. The same as 3dfx and ATI: they submit their white papers, drivers and a new video card with every new release. I think it's called standard practice, and that's about the size of it.

    >Thanks for taking the time to read this.

    By Ymaster October 10, 2000, 04:39 PM

    quote:Originally posted by alexross:
    Hi ben,

Thanks for taking the time to read my response and for seeing where I am coming from. In terms of the image quality, I'll talk to Chris about it (that was his section), and even though the shots didn't show it... the image quality scores weren't wrong, I believe. In theory there are some better techniques used by the Radeon, but in things like 3DMark, and/or to the general user, it doesn't show up. Where there's a dramatic effect is in things like FSAA. That's instantly noticeable. If we were to pinpoint and zoom in on some images, yes, you are correct, there are places where the Radeon looks better. My point, though, is that nobody apart from the most technical person would buy a Radeon and find this out and actually use it like this. The image quality score was based on some very simple criteria, things like 32-bit color etc..

Either way, I may not think the score is wrong, but I will talk with the guys about it and bring up this post. Perhaps next time around we can be a lot more stringent on the image quality score, and/or accept that if there are more people like yourself who do zoom in, put the technology under the microscope and look at white papers, well then yes, the Radeon does do certain things cleaner if not necessarily faster (another trade-off).

    Thanks for the kind words and the appreciation of the work we do.

    Sharky...You still here?

    By alexross October 10, 2000, 04:45 PM

    Originally posted by Robo:
    1) I find it interesting that you declared the 5500 "a dog".

>Well, it's slower than the ATI and GeForce cards by quite a bit. That's its only real weakness, and for that kind of price we had to point that out.

2) I find it interesting you conveniently skipped any Unreal Tournament or Deus Ex gaming comparisons. God forbid you do that; the 5500 owns in those games.

>Yes, and get into the debate of testing them under Glide vs. D3D for the other cards? I don't think that's fair. Although we did write about this for the Voodoo5, saying that if you're into UT then the Voodoo5 and the Glide API win through. UT isn't a good benchmark, which brings up another point we've covered before: it shows almost all cards getting 25fps and is dependent upon CPUs. Ask around or try it for yourself. You'll see that a Savage 2000 is right up there with a Voodoo5 in UT if you bench under D3D. Clearly that can't be right. It's the number of BOTS...

    3) I find it interesting that the only D3d game you tested was an old DX6 game that CONVENIENTLY exhibits an issue with the 1.03 drivers, so you had to go back to 1.01, which CONVENIENTLY exhibits an issue with 1600x1200 in D3d.

>I find it a shame too. I wish there really was something better than that. I have been asking developers for the longest time. I'd like something like NFS Porsche. As far as it being convenient, well, it wasn't at all. We made 3dfx aware of the problem straight away, and they were happy we did so. The reason we chose that resolution was EXACTLY the same reason you pointed out: it's an old game, the engine is old, and the game doesn't even push new video cards until you get to such a high resolution. For the record, we didn't dock the Voodoo5 any points for failing to run that benchmark, because we are aware that it's not the best real-world D3D game test. If you know of any real D3D games other than Unreal, which you suggest we run in Glide anyway, please let us know. For the record, DMZG, 3DMark etc. are all non-starters. They are not games.

    4) I find it interesting you neglected to compare any type of framerates with any racing or flight sims, other than an old-ass DX6 game that is CPU-limited up to 1600x1200x32. yeah, that'll show a big difference.

    >The only one I know of is Falcon 4. I put my hands up and say I know little about flight sims and I don't think many of our readers play them. If I am wrong, I stand corrected. However we did mention Falcon 4.0 and how good it looks on the Voodoo5 with FSAA turned on. Check the article.

    5) I find it interesting that you could've used FRAPS with NFS:PU, or you could've used NASCAR Heat (built-in timedemo), or you could've used Evolva. All of these games are DX7-capable, and Evolva even has HW T&L support. Hey!!! That should show just how superior T&L is (hahaha...) Yet you chose an archaic game (one of the only) that has a compatibility problem with the 3dfx card.

>FRAPS is useless for what our readers want: real frame rates, not ones that are subjective and depend on the human eye. They want REAL frame rates. I think the sales for Evolva were atrocious, but you are right, it is a DX7 game. Actually, I would question whether it was a game at all and whether or not anyone even knows what it is. We've looked at the NASCAR Heat demo before and have talked about including it. That I agree with.

    Hey, if Revolt wasn't old enough, why not try TombRaider??? WTF, let's go all the way back to Doom!

>Those games came out in 1995 and 1996. I think that would be a waste of time, no?

    Your anti-3dfx bias is striking. Not sure what happened, perhaps Brian Burke screwed your girlfriend or something, but your slams are so transparent, and your bias so strong, you are becoming a joke.

>That would be really hard. I know Brian really well, and he's a very likeable and decent person. He is one of the best people at his job, and if indeed you work for 3dfx, please pass that on, even though I have told him many times. And since you know him by name, I presume you know him in person. He happens to be happily married with a child, so I don't think he'd be too happy about that statement.

This used to be a damn good hardware site. Emphasis on *used*, i.e. past tense.
>That's your opinion. And we're not here to be popular. Just to tell it like it is...

    By alexross October 10, 2000, 04:50 PM

Alive and swimming, mate. I knew an article such as the one we did would cause a stir. These kinds of things usually do with some people, either because they get a little too emotional or just don't read the article fully. But yes, there are also some very valid points, and I have to answer them in order to improve etc...

    quote:Originally posted by Ymaster:
    Sharky...You still here?

    By chrisangelini October 10, 2000, 05:06 PM

    Just wanted to quickly address a couple of issues.

1) 5 people spent all day/all night, 5 or 6 days in a row, benchmarking and writing. That means there are 5 different opinions on where the points should have fallen, and the scores that were given are a compromise of those 5 opinions.

2) Desktop image quality at 16 bits does NOT mean that benchmarks are run at 16-bit. Drivers request color depth, so when a benchmark is run at Normal Quality in Quake III, it is in 16-bit, because Quake III's color setting is left at Default. In MAX Quality, Quake III specifically asks the driver for 32-bit and gets it. I repeat: all benchmarks that are specified as 32-bit ARE 32-bit.

3) There is a difference between 2D and 3D image quality, bclothie and Ben. You are comparing my assessment of 2D image quality in one article to my assessment of 3D in another. As to the decision to rate the Radeon and GeForce2 the same for 3D image quality: we did that because the GeForce2 matched the software-rendered reference image the closest (and therefore was the most accurate rendering), but the Radeon looked better (to me, anyways). We DIDN'T base the score on 2D image quality, which is something we will try and take into consideration in the future. This can be seen distinctly in the XOR image comparisons. If these are not clear enough, I have the raw bitmap captures that I will send to anyone who would like to see for themselves.
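The XOR image comparison mentioned above is easy to sketch. This is a hypothetical illustration, not the actual tool the SE staff used: XOR the raw pixel bytes of two same-sized screen captures, and any nonzero byte marks a spot where the two renderings disagree with each other (or with the software-rendered reference image).

```python
# Hypothetical sketch of an XOR image comparison (not the actual SE tool):
# XOR the raw pixel bytes of two same-sized captures; nonzero bytes mark
# pixels where the two renderings disagree.
def xor_compare(pixels_a: bytes, pixels_b: bytes):
    """Return the XOR diff and the fraction of bytes that differ."""
    if len(pixels_a) != len(pixels_b):
        raise ValueError("captures must have identical dimensions")
    diff = bytes(a ^ b for a, b in zip(pixels_a, pixels_b))
    changed = sum(1 for byte in diff if byte) / len(diff)
    return diff, changed

# A capture that deviates from the reference in a single channel value:
reference = bytes([10, 20, 30, 40])
captured = bytes([10, 21, 30, 40])
diff, frac = xor_compare(reference, captured)
```

Identical captures XOR to all zeros, so the fraction of nonzero bytes gives a rough measure of how far a card's output drifts from the reference.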

GTAudiophile, I've seen your posts at 3DSS and know you're an intelligent person. Just because a review isn't done the way YOU would've done it, I'd hope you'd have more sense than to "no longer give a shred of trust to any Sharky review." There is no such thing as an article that makes everyone happy, but truth be told, there really are many times more GeForce2 cards on the market than anything else. I really don't think that Olympic analogy worked, though. You don't hear Lithuania complaining because the United States won 39 gold medals, right? Much less Thailand complaining that Lithuania didn't do as well as the United States.

    Robo, flight sims and racing sims are largely CPU dependent and do not stress the fill-rate of current videocards, but I'm sure you already knew that. If you were trying to prove a point about FSAA, then you're preaching to the choir because the Voodoo5 DID get the highest FSAA score.

    If anyone has any questions about what happened, how we tested or why we rated like we did, my e-mail is chris@sharkyextreme. EMAIL ME, I will be more than happy to explain.

    Ciao!

    By chrisangelini October 10, 2000, 05:21 PM

    I'd also like to mention to the gentleman who brought up ATI's new roadmap - I've already spoken to ATI and the majority of that information is completely false, starting with the Radeon MAXX, which we'll talk about in an upcoming article.

    Ciao!

    By OOAgentFiruz October 10, 2000, 05:57 PM

    SHARKY!!!
Okay, whose idea was it to suddenly make registration a requirement for posting, without much warning? Unfortunately I was on another comp, and I can't remember these cryptic passwords anyway (it's saved on my comp). I wrote and lost a long-ass reply about the 8-video-card review and contest/farce... where is it now? ETHER!!! All gone. Fock That : )

Well, in short: if superjoe (educated Joe) decides he does not want a GeForce, it's Radeon or V5. I.e., it's really a contest between the best of the cores... if superjoe wants DVD playback/image quality and does not play OpenGL exclusively, will he buy the second- or third-place GeForce????? NO, probably a Radeon... or if he plays UT and Deus Ex (yeh yeh, D3D) or flight sims... he'll choose "the you know what".

I like the fact that you break down the points, because quite frankly I couldn't care less about bundles... and tossing that category out, it is obvious to me that the Radeon is a better choice : ) I bought one 32DDR, it's great, and yes, I read all your reviews, which turned out to be helpful. At least the benchmarks are... opinions and comments are for the most part meaningless and only give the forum fans something to flame about. Chris usually has some worthwhile things to say, but the rest of the shit is just that.

It would have been better to scrap the whole medal tomfoolery... and just post the points. Medals are for V.Card fans... who need to get a life : ) along with the stupid-ass CPU fans (not Golden Orbs, stupid) : )

    By GTaudiophile October 10, 2000, 06:44 PM

    Sharky, sorry for being such a dumbass lately, but we all have our days. Your reviews on the whole are very well done, and I usually agree. If you haven't noticed already, I take brand loyalty to an extreme sometimes. No one can deny that nVidia makes pretty darn good videocards. I just think they are already resting on their laurels.

    By Adisharr October 10, 2000, 06:46 PM

quote:Originally posted by rye222:
You know what, Ben and GTaudiophile, shut the hell up! And the same goes for everyone else out there who has the balls to come to this site and TRY to put down this article (or any other article, for that matter).

Ok, Mr. 1 Post... try visiting here more often if you're going to try to tell everyone else what to post and not post. Who do you think you are?

Hate to break this to you - we post what we want to, and as long as it conforms to 'forum use' policies, you're SOL, breadboy.

    By chrisangelini October 10, 2000, 06:53 PM

    GT,

I'll say "thanks" for both Alex and I. I appreciate your opinion, of course, and just want to assure you and everyone else who may think so that there is no loyalty to one company or another for us. The boys in San Jose use a GeForce2 Ultra (those hardcore gamers), my lab uses Radeons for the most part (DVDs are the ONLY thing to do while your 6th KT133 board runs SYSmark in the other room), and none of us have any problem with the Voodoo5 other than that it doesn't match OUR needs very well. Thanks for understanding, though

    Ciao!

    By Adisharr October 10, 2000, 06:54 PM

    quote:Originally posted by toonzwile:
    Boy what a difference a day makes. OKAY!! So i ranted yesterday, and started calling ppl names and what not. I just felt the comments were unjustified, but my argument, while the intention was good and the thought was clear, didnt come out the way i wanted it to. So, i was wrong on just about everything. Thanks to bclothie & Adishaar for pointing this out and letting everyone know just how wrong i was

    However, it was still wrong of u ppl to rant on abt GF2s and why ur Radeons or V5s were at disadvantages. UT for a benchmark? Well, im sure somewhere on these Forums there's an explanation as to why UT isnt used as a benchmark (something about all platforms getting poor performance regardless)... so while UT might fly with Glide (which is less GPU/CPU intensive than OpenGL, upon which its based) and all other cards are forced to use the afore-mentioned complicated OpenGL, theres a certain disadvantage to the Radeon and GF2s there.

    Radeons shine at 32bpp settings? Yes, they do. And yes, Radeons are still in the 'infancy' stage and can only get better. However, u should all know that they always do 32bit testing on these sites (Sharky's, Thresh's, etc). Ur comparing chipsets here, and thats not what the review was abt. The GF2 has been around for how long? and how many driver updates and revisions has the core gone thru? The Radeon, like i just said, is still young. U get ticked off when it doesnt beat a mature chipset, cuz u know that with time it will beat the GF2. I know that too. But that time hasnt come yet... give it a few months. Wait til DX8 comes out... whatever. But stop whining because right now the card's full potential hasnt been realized, and so it loses to a chipset that ur prolly just tired of seeing take the crown. this is the video card market! things are never etched in stone. 3dfx had the crown for a long time... now NV has the crown, and if ATI keeps with the roadmap, it seems that will prolly have the crown soon.

    sorry if it seems like i was putting down the Radeon and the V5. i was extremely surprised when the Radeon was unveiled. I give props to ATI for bringing a true competitor to the scene, and everyday, it seems more and more like ATI's competitor will become the leader. and maybe i am a bit biased against the V5, but just abt everyone should admit that 3dfx has some problems. they relied on their wonderful Glide support... that always gave them an edge over other video chipset manufacturers. well, times are changing. Glide is starting to fade away... video chipsets are getting powerful enough to use true OpenGL instead of a miniGL driver. All video chipset manufacturers are pushing their own 'next-gen' technologies and features... T&L, which hasnt shown its full potential yet... FSAA, which is currently showing its potential... i dont want to upset any ATI fans, so im not going to say anything abt their Charisma Engine, because i dont know anything about it, other than it supports a lot of DX8 features.

    Going back to 3dfx... their T-Buffer technology, FSAA aside, still isnt much more than paperware, with a couple of screenshots here and there, maybe a demo of the possibilities. But maybe this is where my bias becomes apparent: 4 VSA processors SLI'd, each with its own 32mb buffer (which is deceiving to the user who thinks hes getting 128mb buffer), so power-hungry they require a separate PS, and the end result is a card that makes some old VL Bus video cards look short in comparison. If u think about just the performance hit that 3dfx is having with 4x FSAA, u can imagine what kind of performance hit even the V5 6000 is going to take if u enable 4x FSAA, Motion Blur, Depth of Field, Soft shadows & soft reflections! IMO, theyre offering an unrealistic view of what a game could be like. theres no way the game would be playable (at least 30 fps) at a resolution of at least 1024x768x32 if they enable all these features... not unless u have a system running a cpl of GHz, with 256mb of AGP texturing set in the BIOS -- oh i forgot! 3dfx doesnt do AGP texturing -- well, u see where this is going. I give props to 3dfx for their FSAA, its truly the best visual quality right now... but then theres the very crisp and sharp visual quality of the Radeon... and what happens if ATI developers decide to tune their drivers to offer better FSAA performance? then which one will be better at FSAA? 3dfx or ATI?

    For every argument u can make abt any chipset and its features, theres something else the other chipsets have that will always be better... this is progress. u dont know what the future will hold, so theres no point having all these arguments about ur video card(s) and how much they kick ass today... cuz tomorrow, today will be yesterday... and something better will come along.

    Well, since ive quit ranting, have gone from apologizing for my outburst yesterday, to explaining what i think about the different complaints, to showing my thoughts on todays newer products, back to bashing 3dfx, and then finally speculating on the future, i think i will end this. i think im going crazy


    That's a good point.. I am probably heading down the ATI road if I can get one of those 32DDR boards with that rebate

    P.S. Everyone gets pissed off in this forum - It's just the way

    By Magic_Man October 10, 2000, 06:56 PM

After looking at the review, and hoping I did the math right, I find that the Radeon comes in first place if you take out the score each card got on the "Bundle" category. Some of you will say you can't go by that, but to me the bundle isn't a physical, performing part of the card. It is only in relation to price. I personally couldn't care less about bundles; I am looking for the best all-around card, and it looks like the Radeon is it. So after I subtract the bundle numbers from each card, we have a new winner... Radeon !!!


    Flame away if you care.
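Magic_Man's recalculation is simple to sketch. The scores below are made-up placeholders (the actual numbers from the Sharky roundup are not reproduced here); the point is only the mechanics of dropping the Bundle category and re-ranking:

```python
# Hypothetical score table: category points per card. These numbers are
# invented placeholders, NOT the actual scores from the Sharky roundup.
scores = {
    "Radeon 32DDR": {"performance": 17, "features": 18, "image": 9, "bundle": 5},
    "GeForce2 GTS": {"performance": 18, "features": 16, "image": 9, "bundle": 8},
    "Voodoo5 5500": {"performance": 15, "features": 17, "image": 9, "bundle": 7},
}

def rank_without(category, table):
    """Total every category except `category`, highest total first."""
    totals = {
        card: sum(pts for cat, pts in cats.items() if cat != category)
        for card, cats in table.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

with_bundle = rank_without(None, scores)    # nothing excluded
no_bundle = rank_without("bundle", scores)  # Magic_Man's adjustment
```

With these placeholder numbers the GeForce2 wins on total score, but the Radeon moves into first once the bundle points are removed - the kind of swing Magic_Man describes.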

    By OOAgentFiruz October 10, 2000, 07:28 PM

Hey Magic_Man, I agree with you 100 percent. Fock* the bundles : ) How many CD players, video players and NeedForSpeedXX do you need?


*"Cloak of Censorship, A", Chapter 3, pg. 57.

    By rye222 October 11, 2000, 04:23 AM

In response to Adisharr's comment:

Ok, Mr. 1 Post... try visiting here more often if you're going to try to tell everyone else what to post and not post. Who do you think you are?
Hate to break this to you - we post what we want to, and as long as it conforms to 'forum use' policies, you're SOL, breadboy.

I would like to start by telling you that you are right. You have the freedom to post whatever you want, as long as you follow 'forum use' guidelines. My intention was not to restrict anyone's freedom of speech (especially your intelligent and heartwarming comments, Adisharr).

Instead, I simply wanted to bring a few things to mind for those "review slammers" who had nothing better to say about the review than negative comments. And since this is a forum, I ALSO have the right to post my comments (within certain guidelines), which I have yet to hear someone successfully challenge. Yes, I did start my original reply with a semi-narrow-minded comment by telling GTaudiophile and Ben to "shut the hell up", and that's the only thing I apologize for. Those were my emotional reactions to their comments speaking, rather than a calm, educated disagreement. Also, yes, Mr. Adisharr, that was my first time posting a comment in a SHARKY forum, but what does that have to do with anything? I have been a loyal reader of SharkyExtreme's reviews for a LONG time now; I have just never posted anything in their message forums until now. And the only reason I finally decided to post a comment in their forum was because I couldn't believe the unappreciative and cheap put-downs that I was reading toward their review. I mean, was I the only person who actually read the entire review and realized the hard work and great job they did on it? Because the only person who had posted the same thing that I was thinking at the time was Ymaster! Therefore, I felt a serious obligation to speak up and defend SE while bringing the following simple points to these "review slammers'" attention:

    No one is forcing you to read Sharky's reviews. If you don't like them, then just ask for a refund of all that money you put toward their service. Oh, but wait, you haven't paid a penny toward all the hard work they have put in for YOUR benefit. Therefore, who are YOU to demand anything or put down anything they have to say?? That's like a charity giving away free food to the hungry and you walking up to them and complaining that you don't like the way it tastes. LOL - what nerve you must have! If that's the case, then don't eat it (or in this case, don't read it). Furthermore, don't try to discredit them by saying they receive a check from NVIDIA (or anyone else for that matter) unless you have evidence to back that up! If you really think they slanted those benchmark results toward NVIDIA, then why don't you go buy the cards yourself, run your own tests, and then post FACTUAL evidence on your own site backing your claim? Other than that, you might want to try keeping your negative OPINIONS to yourself, as they are only making the rest of us readers look bad and seem unappreciative of the work they put into it.

    BTW, how many other biased review sites have you come across on the Internet? I know that I have found plenty. So why don't you go and complain in their forums? I can tell you why: because thanks to free speech and the wonderful world of disclaimers, they have the right to post whatever they want. Therefore, EVEN IF SharkyExtreme were biased and “paid off” (which I am confident they are not), they would still reserve the right to post whatever they want. And there isn't anything you could do about it except not read it and just look for a different site that DOES have honest and thorough reviews. But I have personally always found SE's reviews to be very factual, extremely thorough, and 100% unbiased. They simply seem to be looking for the exact same answers that many of us are: the straight-up, cut-the-BS, give-me-the-facts/numbers/benchmarks TRUTH about what really is the best hardware out right now! And if not the best, what are the features and the lowdown on everything else?

    Anyway, I think I speak on behalf of the average person out there seeking honest reviews when I say that I really appreciate what SharkyExtreme has done here- especially considering the fact that they provide it for free! I can only hope that the few negative comments from the crowd have not discouraged them from continuing to provide the stellar performance they have given so far to their true audience.

    Thanks again and keep up the good work SharkyExtreme! -Rye

    By Ben October 11, 2000, 04:26 AM

    Magic Man,
    You make a good point. If you don't care about game bundles, the Radeon or the Creative Labs GeForce2 is probably your best bet.

    By Ben October 11, 2000, 04:44 AM

    Rye222,
    I'd like to respond to your last post. It seems to me as though your argument is poorly thought out. A number of loyal forum members had a valid concern/complaint about a recent review, so we voiced our opinions. Do you know how boring this site (and the world for that matter) would be if we agreed on everything?

    "That's like charity giving away free food to those that are hungry and you going up to them and complaining that you don't like the way it tastes."

    This has got to be one of the worst analogies I've ever seen. Comparing an Internet review site to a charity is wrong, if not plain offensive. I will repeat myself once more: this site is not free! Money is not the only form of payment, especially in this technologically advanced world. The transfer of information is almost as valuable as, if not more valuable than, the transfer of cash. We pay for this site by supporting it and its sponsors, and by recommending it to others. If we weren't here then this site would not exist; therefore I am entitled to respectfully disagree with a review without being verbally attacked. My point is proven by Sharky's presence in this thread. He is interested in our questions and has addressed everyone's valid concerns.

    By Althor October 11, 2000, 07:21 AM

    A lot of posts here say one thing very clearly.
    It seems to be pretty difficult for at least some people to correctly interpret the results of the tests the guys from Sharky are doing. (Understand: I don't mean these people are dumbasses, alright?) This is easily proven by the fact that some of you want Sharky to benchmark with Deus Ex and UT. Hellooo, this is pretty stupid, cos these games use an engine that is almost two years old (OK, it has gone through some evolution, but it's still old) and doesn't really support D3D or OpenGL. This changes a bit when installing the UT 432 patch, which gets the GTS REALLY running for the first time.
    The Deus Ex engine is at an even earlier level of development (regarding D3D and OpenGL). So you guys cannot be serious, asking to test with a proprietary API that is supported by only 1 (in words: ONE!) manufacturer.
    I can understand that you prefer your V5 cards. I own a V3 myself, which is a pretty good card. But, hey! Face reality! Glide is about to die. And I'm not unhappy about this.
    I don't like 10,000 APIs being out there while I'm wondering if my hardware is supported.
    So, please get rid of your ridiculous points of view. We need a common base to test/use all the cards with.
    And after all: why do some of you take it so damn personally when your vidcard is not "best of show"?

    By Robo October 11, 2000, 10:56 AM

    quote:Originally posted by alexross:
    >Well it's slower than the ATI and GeForce cards by quite a bit. That's its only real weakness, and for that kind of price we had to point that out.

    so go through the new 1.03 drivers, learn what the capabilities are, and use them to their best advantage. With a bit of driver tweaking, I can get over 80 fps in Q3 @ 1024x768x32 using your SHQ settings. Hardly a dog. 70+ fps in MDK2 @ 1024x768 w/2xFSAA, 1024x768x32, AND 1600x1024. Not doggish to me. I also manage to get 50+ fps in UT @ 1024x768 using UTBench.

    Dog? hardly.

    quote: (posted by Robo)
    2) I find it interesting you conveniently skipped any Unreal Tournament or Deus Ex gaming comparisons. God forbid you do that - the 5500 owns in those games.

    (posted by Sharky)
    >Yes and get into the debate of testing them under Glide vs D3D for the other cards? I don't think that's fair.

    Is it a game? Yes. is it a popular game? yes. Is it a game that has a few "spawns" using the same engine? Yes.
    So test it. The results are very viable, and very applicable to present and future popular games that are shipping/will be shipping.

    as far as glide is concerned:

    MDK2 has T&L. Enable T&L for cards that are T&L capable. Is glide a feature? Yes. Enable glide for cards that are glide capable. Really nothing much to discuss.

    quote:It shows almost all cards getting 25fps and is dependent upon CPUs. Ask around or try it for yourself. You'll see that a Savage 2000 is right up there with a Voodoo5 in UT if you bench under D3D. Clearly that can't be right. It's the number of BOTS...

    agreed, but in glide, the 5500 gets a good 10-15 fps higher than ANY card out there.

    quote:
    3) I find it interesting that the only D3D game you tested was an old DX6 game that CONVENIENTLY exhibits an issue with the 1.03 drivers, so you had to go back to 1.01, which CONVENIENTLY exhibits an issue with 1600x1200 in D3D.

    >I find it a shame too. I wish there really was something better than that. I have been asking developers for the longest time. I'd like something like NFS Porsche. As far as it being convenient, well, it wasn't at all. We made 3dfx aware of the problem straight away and they were happy we did so. The reason we chose that resolution was EXACTLY the same reason you pointed out: it's an old game, the engine is old, and the game doesn't even push new video cards until you get to such a high resolution.
    so do like almost ALL simmers do, and enable FSAA, since it is a very viable feature.

    4xFSAA @ 1024 will certainly yield some interesting results, yes?


    quote:For the record, DMZG, 3DMark etc. are all non-starters. They are not games.
    yes, and kudos for that.

    4) I find it interesting you neglected to compare any type of framerates with any racing or flight sims, other than an old-ass DX6 game that is CPU-limited up to 1600x1200x32. Yeah, that'll show a big difference.

    quote:>The only one I know of is Falcon 4. I put my hands up and say I know little about flight sims

    fair enough, but since flight and racing sims ARE very popular, you do a disservice by stating that the 5500 is a "dog" when it plays sims far better than the alternatives.

    quote:I think the sales for Evolva were atrocious, but you are right, it is a DX7 game. Actually I would question whether it was a game at all and whether or not anyone even knows what it is.
    I've seen it in almost every store. Sales may have sucked (gameplay was weak), but the graphics engine is very good and very advanced, and shows what T&L can do (or more appropriately, DOESN'T).

    quote:We've looked at the NASCAR Heat demo before and have talked about including it. That I agree with.

    FSAA in that game. Remember that.

    quote:Hey, if Revolt wasn't old enough, why not try TombRaider??? WTF, let's go all the way back to Doom!

    >Those games came out in 1995 and 1996. I think that would be a waste of time no?

    sarcasm....


    thx for the response

    By BYTHOR October 11, 2000, 12:42 PM

    quote:Originally posted by Robo:
    sarcasm.....


    thx for the response

    Or why not try using something totally different, like Ultima 9?

    By Iam_Chief_Wiggum October 11, 2000, 12:49 PM

    This thread is too long.
    My card is the best.

    Ha ha.

    btw, who cares?

    By Sharky October 11, 2000, 02:08 PM

    Thanks for taking the time to read and understand where we are coming from.

    quote:Originally posted by GTaudiophile:
    Sharky, sorry for being such a dumbass lately, but we all have our days. Your reviews on the whole are very well done, and I usually agree. If you haven't noticed already, I take brand loyalty to an extreme sometimes. No one can deny that nVidia makes pretty darn good videocards. I just think they are already resting on their laurels.

    By rye222 October 11, 2000, 02:40 PM

    Ben, you said:

    I am entitled to respectfully disagree with a review without being verbally attacked. My point is proven by sharky's presence in this thread.

    You are right, and the key words there were respectfully and disagree. However, you seem to have misinterpreted my argument, because I was only responding to those who were dishing out cheap put-downs and making claims they could not back up in attempts to discredit SE's review. I personally felt that was out of line and uncalled for, and that's why I said something. Again, don't get me wrong, as I totally agree with respectful and/or valid disagreements, but not blatant unappreciative put-downs - especially against a service that isn't charging you for the information it provides.

    -Rye

    By Sharky October 11, 2000, 04:15 PM

    quote:

    By Robo October 11, 2000, 07:27 PM

    okay Shark, you seem to be taking my comments quite reasonably and objectively, so let's move right along here:

    /me said "hey, tweek the drivers, find out what the card can do"

    /Sharky said :"Well the point is that we try to have the same settings for each card. It's certainly valid to push each card at their best possible settings but you're not comparing apples to apples then. For all the tweaking I could do with the 3dfx card I could do the same for ATI's and NVIDIAs. That might well be a valid thing to do but in my experience that isn't how the majority of readers use their cards."

    AHA! But here's the thing. When you get a card for YOUR system, don't you go through the drivers and figure out what all the crap means? What does this setting do, what does that setting do? For example, you might be led to think the 5500 has blurry 3D quality if you didn't know about the LOD bias slider, right? Heck, you gotta futz with drivers to enable FSAA or anisotropic filtering (on the GTS and Radeon), right? Is it asking too much to go through the drivers a bit more? Granted, the 5500 has A LOT of stuff you can fiddle with, and I suppose, for a website reviewer, it might be a pain in the ass, but peeps are relying on YOU to give them the scoop. Don't be afraid to do a little extra work. Interesting information is what garners your hits, right?

    Besides, that IS how the VAST majority of peeps use their cards. If you spend $250 on a video card, one might be led to believe you're pretty "hardcore" about your hardware. Goofs like us are the ones that actually KNOW what all that crap in the BIOS means. I mean, who the hell really understands what "memory interleaving" is? Peeps that buy the best hardware and tweak it to make it better. We KNOW how to get the best performance out of our cards. The problem is, when you make statements like you have made without really delving into the drivers and learning what is going on, you make yourself seem shortsighted.

    /Sharky spoketh: "It is a game yes. The engine is being used by other games yes. But is it a valuable video card benchmark? No. It isn't."

    BAH!!! Answer this: what is more important, finding out how the card benchmarks, or finding out how the card PLAYS THE GAME??

    Correct me if I'm wrong, but the ENTIRE REASON to benchmark a card is to find out HOW IT PLAYS A GAME, right? You obviously have the right frame of mind (your comments about 3dMark are spot on), but if there are several popular games out there using the same engine, doesn't it make sense to find out how well the cards can run that game?

    /Sharky: "Actually the way we do MDK2 is like this: We run it enabled for T&L capable cards and since the Voodoo5 is one of the few that isn't we take that into account and run it without. The scores are thus closer but we do mention that this was achieved without T&L turned on."

    okay, you're one of the smarter sites. But T&L is a feature, just as glide is a feature. Are there ANY GeForce owners out there who are going to play MDK2 WITHOUT T&L enabled? Nope. Are there ANY voodoo card owners out there who are NOT going to play in glide? Nope.

    If you want to compare apples vs. apples, then compare the cards how they will actually BE USED.

    Example: you are trying to find the top speed of a 6-speed Ferrari and a 5-speed Porsche. Do you cripple the Ferrari and only use 5 gears? Of course not. You won't disable T&L for a GTS, you wouldn't disable Hyper-Z for a Radeon, so why disable Glide for the 5500?
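    Robo's "test each card as it will actually be used" policy can be sketched as a tiny settings table. This is purely illustrative - the card names match the thread, but the feature flags and function are hypothetical, not anything from the review itself:

```python
# Hypothetical sketch: pick benchmark settings per card based on the
# vendor features a real owner would leave enabled, instead of forcing
# a lowest-common-denominator configuration on every card.

CARD_FEATURES = {
    "Voodoo5 5500": {"glide"},                 # no hardware T&L, but Glide
    "GeForce2 GTS": {"t_and_l"},               # hardware T&L under D3D/OpenGL
    "Radeon 32DDR": {"t_and_l", "hyper_z"},    # T&L plus HyperZ
}

def benchmark_settings(card):
    """Return the settings an owner of `card` would actually play with."""
    features = CARD_FEATURES.get(card, set())
    return {
        "resolution": (1024, 768),
        "color_depth": 32,
        # Voodoo owners play in Glide; everyone else uses Direct3D here.
        "api": "Glide" if "glide" in features else "Direct3D",
        "hardware_tnl": "t_and_l" in features,
    }

print(benchmark_settings("Voodoo5 5500"))
print(benchmark_settings("GeForce2 GTS"))
```

    The point of the sketch is the design choice, not the details: "apples to apples" here means equal *usage*, not equal *settings*.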

    /Sharky: "Therefore it is NOT a good benchmark as I was trying to tell you. I agree with you that it gets 15fps higher than any other card when using GLIDE and if you actually bothered to read the article fully you would have seen that. I go to great lengths to explain that in every single Voodoo5 piece we have ever done."

    right, but when peeps who aren't "in the know" read "it's a dog", and their new favorite computer game is UT, guess what? They lose out if they go by what you say.

    /Sharky:"I don't believe FSAA is a very viable feature right now. Even though 3dfx does it better than anyone else, it is not a "FREE" feature. It is too expensive and slows the frame rate down way too much....It's certainly a thing for the future but right now 4X FSAA is too slow in anything above 640x480."

    okay, so what about playing NFS-PU in 1024x768x16 with 4xFSAA? I get almost identical framerates in MDK2 @ 1600x1024 or 1024x768 w/2xFSAA, both in the 70s. Would you consider that unusable?

    /Sharky: "I would chose to play faster and in a higher resolution to compensate a little for the jaggies. So would everyone else I have ever spoken to outside the walls of 3dfx...."

    indeed, I'd MUCH rather use higher resolution with my GTS than FSAA, but that's because FSAA just doesn't look all that good with the GTS by comparison.

    in some cases, Quake3, for example, or UT, higher resolution rules. In many cases, such as just about every single racing and flight sim out there, FSAA makes a TREMENDOUS difference, without affecting framerate all that much.

    Besides, we *were* talking about a racing-sim type of game, and we *were* talking about a game that was CPU-limited up to 1600x1200x32, so why not enable FSAA, to find out just how far you can go before it "is too slow in anything above 640x480"?

    /Sharky: "I do not know that for a fact. Yes, in Falcon 4.0 the FSAA looks great when turned on. But as Chris said, flight sims and racing sims are largely CPU-dependent and do not stress the fill-rate of current video cards, but I'm sure you already knew that."

    indeed, I knew that, but FSAA places an extra burden on the fill-rate of a card, does it not? (along with memory throughput)

    So, why not stress the fill-rate and efficiency of a card by enabling FSAA in games that are CPU-limited without it?
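    For what it's worth, the fill-rate argument here is simple arithmetic: supersampled FSAA at N samples per pixel means the card shades roughly N times as many pixels per frame. A minimal back-of-the-envelope sketch, assuming plain supersampling and ignoring the downsample and memory-bandwidth costs (so real-world numbers will differ):

```python
# Rough fill-rate demand: pixels the card must shade per second.
# Assumes FSAA is ordinary supersampling (N samples ~= N pixels drawn).

def pixels_per_second(width, height, fps, fsaa_samples=1):
    """Shaded pixels per second at a given resolution, frame rate,
    and supersampling factor."""
    return width * height * fps * fsaa_samples

base = pixels_per_second(1024, 768, 60)        # no FSAA
with_4x = pixels_per_second(1024, 768, 60, 4)  # 4x FSAA

# 4x FSAA at 1024x768 shades about as many pixels as 2048x1536 would,
# which is why it stresses fill rate even in a CPU-limited game.
print(base, with_4x)
```

    That is the crux of the exchange: a game can be CPU-limited at plain 1600x1200 yet still expose fill-rate differences once FSAA multiplies the per-frame pixel load.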

    /Sharky (referencing FSAA): "Yes I will thanks. I am actually aware of it and said this in the article since you obviously didn't read it fully"

    indeed I did, but when you say "it's a dog", expect to hear a bit of dissonance. Fair enough?

    /Sharky: "OK well thanks for reading my responses and as I said yesterday, if indeed you are as I suspect from 3dfx, then please do give us a call."

    I'm not from 3dfx.

    /Sharky: "I do have to question your rather unpleasant comments about Brian Burke. And just for the record, I don't have a girl friend, I have a wife.'

    glad to hear you're married, and I'll apologize for the comment, which was in rather poor taste, especially in light of what apparently has just occurred with BB.

    By Ymaster October 11, 2000, 07:59 PM

    You guys don't think I have the time to read all of that do you? Well I do!

    By Dan1 October 11, 2000, 10:07 PM

    Yeah, well, now I feel I need to complain a bit too... everyone else has, so it's my turn now. I'd like to yell, rant, scream - whatever else people have done in this thread, consider my comment to be the same. Anyway, you were unfair to the Hercules Prophet II. As I found out when mine arrived, its default clock is 220/365 core/mem. I really hadn't expected this, and it was a wonderful surprise for me. I think you should take into account what the default speed is for each GF2 manufacturer; it might alleviate some of the complaints about 6 identical NVIDIA cards. Now I must admit my card is the 64MB version, so maybe they only increased the default speed on that one, and if so, maybe I am wrong, in which case I apologize. But if I am wrong, then at least remember what the default clock is for consumers who get the 64MB Prophet card when you do an ultra-high-end round-up of cards with the 64MB VIVO Radeon (also clocked higher than the regular version). That's all. Also, if the default speed for the 64MB Prophet II isn't 220/365, could someone please tell me why mine was clocked that way out of the box? I was sort of dumbfounded when I started up pstrip on it the first time.


    PS.
    Whoops. My comment didn't quote anything and didn't take up more than 1 screen...I'll try to fix this ASAP, sorry for being reasonable. It's my fault entirely and I'll try to make it longer as soon as I get home.

    Hehe, just trying to poke some fun at this whole overgrown, over-emotional thread.

    Stay loose

    By Sol October 11, 2000, 11:12 PM

    AH, so many words. Here is my take on it: great article. Nice and simple.

    By alexross October 11, 2000, 11:38 PM

    okay Shark, you seem to be taking my comments quite reasonably and objectively, so let's move right along here:

    >>>>>Same here so let's go

    AHA! But here's the thing. When you get a card for YOUR system, don't you go through the drivers and figure out what all the crap means?

    >>>That may well be the case for a few folks, yes. But it's hard enough trying to compare apples to apples as it is.

    What does this setting do, what does that setting do? For example, you might be led to think the 5500 has blurry 3d quality, if you didn't know about the lodbias slider, right?

    >>>True. True...

    Heck, you gotta futz with drivers to enable FSAA or anisotropic (on the GTS and Radeon), right? Is it asking too much to go through the drivers a bit more?


    >>>In an ideal world no it isn't but as I was saying this is a roundup and in the single card reviews we do tend to do this.

    Granted, the 5500 has ALOT of stuff you can fiddle with. and I suppose, for a website reviewer, it might be a pain in the ass, but peeps are relying on YOU to give them the scoop. Don't be afraid to do a little extra work. Interesting information is what garners your hits, right?

    >>>Interesting info also garners our interest. I think that there's always a lot of work put into the drivers, and I wish half the time we spent working on them was for positive things instead of fiddling with them to DISABLE things like forced 16-bit textures and the like. That's always a pain.

    Besides, that IS how the VAST majority of peeps use their cards. If you spend $250 on a video card, one might be led to believe you're pretty "hardcore" about your hardware. Goofs like us are the ones that actually KNOW what all that crap in the BIOS means. I mean, who the hell really understands what "memory interleaving" is? Peeps that buy the best hardware, and tweek it to make it better. We KNOW how to get the best performance out of our cards. The problem is, when you make statements like you have made without really delving into the drivers and learning what is going on, you make yourself seem shortsighted.

    >>>Your search for something good to say about the Voodoo5 is getting harder and harder to pin down. Start with something easy: FSAA looks the best on the Voodoo5. That is clear to all eyes. Delving into drivers is, in an ideal world, a good thing, yes, and we feel we do enough of it, or at least as much as we can. We didn't have a section on drivers, but perhaps next time we can add a category for that; should people request it, then we will.

    BAH!!! Answer this: what is more important, finding out how the card benchmarks, or finding out how the card PLAYS THE GAME??

    >>>Benchmarks are important to OEMs and end users, and that is that. Whether or not that is overly important, or perhaps false, that is how cards are judged today. Of course we spend time playing games on different cards, and that is how we get a feel for a card. That's the long-term testing, which I feel is more important, especially when you end up buying a card. You find things along the way that often do not show up in benchmarks.

    Correct me if I'm wrong, but the ENTIRE REASON to benchmark a card is to find out HOW IT PLAYS A GAME, right? You obviously have the right frame of mind (your comments about 3dMark are spot on)

    >>>Well I am glad we agree on that.

    , but if there are several popular games out there using the same engine, doesn't it make sense to find out how well the cards can run that game?

    >>>I agree on this and keep telling you, that even though it's a "subjective" thing, we have said that the Voodoo5 plays Deus Ex really well compared to the NVIDIA hardware. In addition we have run some tests and done a lot of playing on UT and well we're going around in circles here because you are saying what I've already said in the piece, no?


    okay, you're one of the smarter sites.

    >>>>Nope. We try our best but don't claim to be smarter.

    But T&L is a feature, just as glide is a feature. Are there ANY GeForce owners out there who are going to play MDK2 WITHOUT T&L enabled? Nope.

    >>>Correct. Think apples to apples, though, instead of pointing out the obvious, which anyone can do really.

    Are there ANY voodoo card owners out there who are NOT going to play in glide? Nope.

    >>>I hope not. That'd be a waste.

    If you want to compare apples vs. apples, then compare the cards how they will actually BE USED.

    >>>We do that where possible. Where the playing field changes for cards, or features are unsupported, we try to figure things out. We do not like comparing apples to oranges (Joan, I wish you'd never coined that term! "Apples to apples", way back in '97 with TNT vs. VoodooG).

    Example: you are trying to find the top speed of a 6-speed Ferrari and a 5-speed Porsche. Do you cripple the Ferrari and only use 5 gears? Of course not. You wont' disable T&L for a GTS, you wouldn't disable Hyper-Z for a Radeon, so why disable glide for the 5500?

    >>>Correct and these are obvious things that we don't need to waste people's time with. A simple sentence or two explaining that, which we did do, solves that no?

    right, but when peeps who aren't "in the know" read "it's a dog", and their new favorite computer game is Ut, guess what? They lose out, if they go by what you say.

    >>>>Only if, like you, they get stuck on one word and don't fully read the article. I know that one word seems to bug you a lot, but there are 45 others way more positive in that piece.

    okay, so what about playing NFS-PU in 1024x768x16 with 4xFSAA?

    >>>>Tried it. Too slow. Or at least slower and to me less impressive than playing it at 1600x1200x32 with no FSAA. Obviously it isn't really needed there.

    I get almost identical framerates in MDK2 @ 1600x1024 or 1024x768 w/2xFSAA, both in the 70s. Would you consider that unusable?

    >>>Not unusable, but still on the slow side. 1600x1200 is still my choice. And that's a subjective thing in many ways, and I've said that.

    indeed, I'd MUCH rather use higher resolution with my GTS than FSAA, but that's because FSAA just doesn't look all that good with the GTS by comparison.

    >>>The GTS' FSAA isn't as crisp, and it is indeed slower. But as I said, 1600x1200 is faster and better looking than 800x600 or 1024x768 (which is slow) on the Voodoo5.

    in some cases, Quake3, for example, or UT, higher resolution rules. In many cases, such as just about every single racing and flight sim out there, FSAA makes a TREMENDOUS difference, without affecting framerate all that much.

    >>>I disagree that it doesn't make a difference. You talked about using FRAPS... well, when I have done so, I do see a huge drop. Porsche Unleashed is too slow at 1024x768x32 with 4xFSAA and not worth it at 2x. 1600x1200x32 on the Voodoo5 with no FSAA is just slower than the GeForce2 at 1600x1200x32. Simple as that. If you have both cards, which you say you do, just try it for yourself. I can tell you until you're blue in the face, but really you can only believe it when you do it for yourself. It's just slower. Period.

    Besides, we *were* talking about a racing-sim type of game, and we *were* talking about a game that was CPU-limited up to 1600x1200x32, so why not enable FSAA, to find out just how far you can go before it "is too slow in anything above 640x480"

    >>>Shrug... let's agree to disagree on this one. Or at least try these things out for yourself.

    indeed, I knew that, but FSAA places an extra burden on the fill-rate of a card, does it not? (along with memory throughput)

    >>>>Yes indeed.

    So, why not stress the fill-rate and efficiency of a card by using FSAA in games that are CPU-dependent, without FSAA?

    indeed I did, but when you say "it's a dog", expect to hear a bit of dissonance. Fair enough?

    >>>I can expect owners that are very emotional about video cards to react with emotion, and/or employees from different companies to be upset. I understand that working on a chip for two years makes it your 'baby', but please understand this: the readers are "my baby". I don't mince words, and neither does anyone at SE. Our readers expect the truth and we want to take care of them. We don't want to take care of companies, or even get concerned about what they think, unless of course there are technical inaccuracies, again because that affects my readers. I do this for the readers' sake and they come first. Always have and always will.

    I'm not from 3dfx.

    >>>OK well I did talk to 3dfx yesterday and they did read this post and well.. never mind. I'll take your word for it.

    /Sharky: "I do have to question your rather unpleasant comments about Brian Burke. And just for the record, I don't have a girl friend, I have a wife.'

    glad to hear you're married, and I'll apologize for the comment, which was in rather poor taste, especialy in light of what apparently has just occured with BB.

    >>>I appreciate your apology and accept it. I think you do much better discussing technology when you don't get emotional, as per yesterday. It's more enjoyable for everyone this way, I think. Thanks for reading. Pardon the typos... I am in the lab benchmarking and typing on a crap laptop keyboard.


    By alexross October 11, 2000, 11:41 PM

    Just a quick note to everyone who has participated in this discussion.

    Thanks to everyone for the comments and feedback, good and bad. It was a useful exercise. I appreciate how emotional things can get, so please remember that we're dealing with graphics cards and not something more important.

    Cheers guys.

    By rye222 October 12, 2000, 12:01 AM

    Wow, I can't believe this single post has generated 73 replies so far! (Furthermore, I can't believe I have read them all.) BTW Shark, what is the record so far for the most replies to a single post on these forums? I'm sure it's probably like 200 or somethin....

    By Ymaster October 12, 2000, 12:34 AM

    quote:Originally posted by alexross:
    okay Shark, you seem to be taking my comments quite reasonably and objectively, so let's move right along here:

    >>>>>Same here so let's go

    AHA! But here's the thing. When you get a card for YOUR system, don't you go through the drivers and figure out what all the crap means?

    >>>That may well be the case for a few folks, yes. But it's hard enough trying to compare apples to apples as it is.

    What does this setting do, what does that setting do? For example, you might be led to think the 5500 has blurry 3d quality, if you didn't know about the lodbias slider, right?

    >>>True. True...

    Heck, you gotta futz with drivers to enable FSAA or anisotropic (on the GTS and Radeon), right? Is it asking too much to go through the drivers a bit more?


    >>>In an ideal world no it isn't but as I was saying this is a roundup and in the single card reviews we do tend to do this.

    Granted, the 5500 has ALOT of stuff you can fiddle with. and I suppose, for a website reviewer, it might be a pain in the ass, but peeps are relying on YOU to give them the scoop. Don't be afraid to do a little extra work. Interesting information is what garners your hits, right?

    >>>Interesting info also garners our interests. I think that there's always a lot of work put into the drivers and I wish half the time we spent working on them was for positive thing instead of trying to fiddle with them so as to DISABLE things like forcing 16-bit textures and the like. That's always a pain.

    Besides, that IS how the VAST majority of peeps use their cards. If you spend $250 on a video card, one might be led to believe you're pretty "hardcore" about your hardware. Goofs like us are the ones that actually KNOW what all that crap in the BIOS means. I mean, who the hell really understands what "memory interleaving" is? Peeps that buy the best hardware, and tweek it to make it better. We KNOW how to get the best performance out of our cards. The problem is, when you make statements like you have made without really delving into the drivers and learning what is going on, you make yourself seem shortsighted.

    >>>Your point about searching through to find something good about the Voodoo5 is getting harder and harder to pinpoint. Start with something easy: FSAA looks the best on the Voodoo5. That is clear to all eyes. Delving into drivers in an ideal world is a good thing, yes, and we feel we do enough of it, or at least as much as we can. If people request a section on drivers, then perhaps next time we can add a category for that.

    BAH!!! Answer this: what is more important, finding out how the card benchmarks, or finding out how the card PLAYS THE GAME??

    >>>Benchmarks are important to OEMs and end users, and that is that. Whether it is overly important or perhaps falsely so, that is how cards are judged today. Of course we spend time playing games on different cards, and that is how we get a feel for a card. That's the long-term testing, and that, I feel, is more important, especially when you end up buying a card. You find things along the way that often do not show up in benchmarks.

    Correct me if I'm wrong, but the ENTIRE REASON to benchmark a card is to find out HOW IT PLAYS A GAME, right? You obviously have the right frame of mind (your comments about 3DMark are spot on)

    >>>Well I am glad we agree on that.

    , but if there are several popular games out there using the same engine, doesn't it make sense to find out how well the cards can run that game?

    >>>I agree on this and keep telling you, that even though it's a "subjective" thing, we have said that the Voodoo5 plays Deus Ex really well compared to the NVIDIA hardware. In addition we have run some tests and done a lot of playing on UT and well we're going around in circles here because you are saying what I've already said in the piece, no?


    okay, you're one of the smarter sites.

    >>>>Nope. We try our best but don't claim to be smarter.

    But T&L is a feature, just as Glide is a feature. Are there ANY GeForce owners out there who are going to play MDK2 WITHOUT T&L enabled? Nope.

    >>>Correct. Think apples to apples, though, instead of pointing out the obvious, which anyone can do really.

    Are there ANY Voodoo card owners out there who are NOT going to play in Glide? Nope.

    >>>I hope not. That'd be a waste.

    If you want to compare apples vs. apples, then compare the cards how they will actually BE USED.

    >>>We do that where possible. Where the playing field changes for cards or features are unsupported, we try to figure things out. We do not like comparing apples to oranges (Joan, I wish you'd never coined that term! Apples to apples, way back in '97 with TNT vs. Voodoo Graphics).

    Example: you are trying to find the top speed of a 6-speed Ferrari and a 5-speed Porsche. Do you cripple the Ferrari and only use 5 gears? Of course not. You wouldn't disable T&L on a GTS, you wouldn't disable Hyper-Z on a Radeon, so why disable Glide on the 5500?

    >>>Correct and these are obvious things that we don't need to waste people's time with. A simple sentence or two explaining that, which we did do, solves that no?

    Right, but when peeps who aren't "in the know" read "it's a dog", and their new favorite computer game is UT, guess what? They lose out, if they go by what you say.

    >>>>Only if, like you, they get stuck on one word and don't fully read the article. I know that one word seems to bug you a lot, but there are 45 others way more positive in that piece.

    okay, so what about playing NFS-PU in 1024x768x16 with 4xFSAA?

    >>>>Tried it. Too slow. Or at least slower and to me less impressive than playing it at 1600x1200x32 with no FSAA. Obviously it isn't really needed there.

    I get almost identical framerates in MDK2 @ 1600x1024 or 1024x768 w/2xFSAA, both in the 70s. Would you consider that unusable?

    >>>Not unusable but still on the slow side. 1600x1200 is still my choice. And that's a subjective thing in many ways, and I've said that.
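    For what it's worth, the near-identical framerates reported above square with a quick sample-count estimate. This is my own back-of-the-envelope sketch (assuming 2xFSAA supersamples each pixel twice and fill rate is the bottleneck), not anything from the review:

```python
# Rough fill-rate comparison: how many samples the card shades per frame.
def samples_per_frame(width: int, height: int, fsaa: int = 1) -> int:
    """fsaa = supersamples per output pixel (1 = FSAA off)."""
    return width * height * fsaa

plain = samples_per_frame(1600, 1024)         # 1600x1024, no FSAA
fsaa2 = samples_per_frame(1024, 768, fsaa=2)  # 1024x768 with 2xFSAA

print(plain)  # 1638400
print(fsaa2)  # 1572864
# The two workloads differ by only about 4%, so seeing both land
# "in the 70s" is exactly what a fill-rate-bound card would do.
```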

    indeed, I'd MUCH rather use higher resolution with my GTS than FSAA, but that's because FSAA just doesn't look all that good with the GTS by comparison.

    >>>The GTS' FSAA isn't as crisp, and indeed slower. But as I said, 1600x1200 is faster and better looking than 800x600 or 1024x768 (which is slow) on the Voodoo5.

    in some cases, Quake3, for example, or UT, higher resolution rules. In many cases, such as just about every single racing and flight sim out there, FSAA makes a TREMENDOUS difference, without affecting framerate all that much.

    >>>I disagree that it doesn't make a difference. You talked about using FRAPS... well, when I have done so I do see a huge drop. Porsche Unleashed is too slow at 1024x768x32 with 4xFSAA and not worth it at 2x. 1600x1200x32 on the Voodoo5 with no FSAA is just slower than the GeForce 2 at 1600x1200x32. Simple as that. If you have both cards, which you say you do, just try it for yourself. I can tell you until you're blue in the face, but really you can only believe it when you do it for yourself. It's just slower. Period.

    Besides, we *were* talking about a racing-sim type of game, and we *were* talking about a game that was CPU-limited up to 1600x1200x32, so why not enable FSAA, to find out just how far you can go before it "is too slow in anything above 640x480"?

    >>>Shrug... let's agree to disagree on this one. Or at least try these things out for yourself.

    indeed, I knew that, but FSAA places an extra burden on the fill-rate of a card, does it not? (along with memory throughput)

    >>>>Yes indeed.

    So, why not stress the fill-rate and efficiency of a card by using FSAA in games that are CPU-dependent, without FSAA?
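    As a back-of-the-envelope illustration of that point (my own sketch, not from the review): supersampling FSAA multiplies the pixel workload while leaving per-frame CPU work alone, so a CPU-limited game is exactly where the headroom for it lives:

```python
# Supersampling FSAA shades N samples per output pixel, so the GPU's
# fill-rate load scales by N while geometry/CPU work per frame does not.
def gpu_samples_per_second(width: int, height: int, fsaa: int, fps: int) -> int:
    return width * height * fsaa * fps

base = gpu_samples_per_second(1024, 768, fsaa=1, fps=60)   # CPU-limited, fill rate idle
heavy = gpu_samples_per_second(1024, 768, fsaa=4, fps=60)  # 4xFSAA quadruples the load
high_res = gpu_samples_per_second(1600, 1200, fsaa=1, fps=60)

print(heavy / base)      # 4.0
print(heavy / high_res)  # ~1.64: 4xFSAA at 1024x768 out-stresses
                         # even 1600x1200 with FSAA off
```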

    indeed I did, but when you say "it's a dog", expect to hear a bit of dissonance. Fair enough?

    >>>I can expect owners that are very emotional about video cards to react with emotion, and/or employees from different companies being upset. I understand that working on a chip for two years makes it your 'baby', but please understand this: the readers are "my baby". I don't mince words, and neither does anyone at SE. Our readers expect the truth and we want to take care of them. We don't want to take care of companies or even get concerned about what they think, unless of course there are technical inaccuracies, again because it affects my readers. I do this for the readers' sake, and they come first. Always have and always will.

    I'm not from 3dfx.

    >>>OK, well, I did talk to 3dfx yesterday and they did read this post and well... never mind. I'll take your word for it.

    /Sharky: "I do have to question your rather unpleasant comments about Brian Burke. And just for the record, I don't have a girlfriend, I have a wife."

    Glad to hear you're married, and I'll apologize for the comment, which was in rather poor taste, especially in light of what apparently has just occurred with BB.

    >>>I appreciate your apology and accept it. I think you do much better discussing technology when you don't get emotional, as per yesterday. It's more enjoyable for everyone this way, I think. Thanks for reading. Pardon the typos... I am in the lab benchmarking and typing on a crap laptop keyboard.


    I'm not going to even start with this one...way too long!

    By alexross October 12, 2000, 12:36 AM

    Cheers very much Sol.

    quote:Originally posted by Sol:
    AH, so many words. Here is my take on it, Great Article. Nice and Simple

    By alexross October 12, 2000, 12:37 AM

    heya rye,

    Not sure what the record is but this one might break it I reckon. It's an emotional thread, that's for sure.

    quote:Originally posted by rye222:
    Wow, I can't believe this single post has generated 73 replies so far! (Furthermore, I can't believe I have read them all.) BTW Shark, what is the record so far for the most replies in a single post on these forums? I'm sure it's probably like 200 or somethin....

    By Ymaster October 12, 2000, 12:38 AM

    quote:Originally posted by alexross:
    Cheers very much Sol.


    I love small posts...

    By mlusr October 12, 2000, 12:46 AM

    quote:Originally posted by Ymaster:

    I love small posts...

    Same here
    One more thing. I think SE has done great job here and I'm sure they keep doing a lot more.

    By the_writer921 October 12, 2000, 01:11 AM

    mine's at 92 posts (what was your first computer in CPUs/overclocking) and "what do you have under *your* hood" has over 105 and that was a while ago.
    other than that i am glad people are showing a lot more maturity than the first few posts...i'm glad that people are cooling down and rationally discussing things, instead of the accusations and whatnot...good work guys
    -the_writer921

    By Robo October 12, 2000, 01:21 AM

    well, I've had a 32MB GTS, a 64MB GTS, and 2 different 5500's. Of course, all my "comparisons" are taken on an overclocked P3 system with all cards overclocked (both 5500's hit 190, the 64MB GTS was at 400 memory and the 32 @ 365)

    I thought all the cards kicked ass. Guess I got hung up on the "5500 is a dog", when it does a mighty fine job. Of course, I was one of the many cursed with Detonator 3 compatibility issues, and I also hated Q3-TC on it, so I disabled it, so the immense speed "granted" to the GTS with TC using 6.18 just didn't cross my screen.

    ESPECIALLY NFS-PU. I'm damn lucky I could get through a single darn race with the GTS using 5.32 (an issue that is presently addressed in the GeForce FAQ)

    WRT FSAA - Sharky, we'll agree to disagree here. Up the clock speed of the 5500 to 183 and you'll see why. Of course, the counter is "well, why not measure them at default", and from a reviewer's point of view, you can't go overclocking all your hardware for comparisons.

    With that in mind, I'll bow out of this thread, with some dignity intact.

    good discussion. I still think you're wrong, but that's okay. Nobody's perfect. :P

    I'm quite curious about the remark you made about "speaking to a 3dfx employee" tho. You can check my email, it's the real deal. Just a normal idiot on the west coast.

    By chrisangelini October 12, 2000, 03:43 AM

    Robo,

    I've actually yet to see a Voodoo5 that would reliably hit 183MHz. I'm sure they exist, but we have topped out at around 173 or 174 with all of ours.

    Ciao!

    By alexross October 12, 2000, 04:33 AM

    Same here Thanks for the kind words. I think it's settled down a bit!

    quote:Originally posted by the_writer921:
    mine's at 92 posts (what was your first computer in CPUs/overclocking) and "what do you have under *your* hood" has over 105 and that was a while ago.
    other than that i am glad people are showing a lot more maturity than the first few posts...i'm glad that people are cooling down and rationally discussing things, instead of the accusations and whatnot...good work guys
    -the_writer921

    By trane77 October 12, 2000, 05:13 AM

    quote:Ok, I can't really see the argument that this is an unfair review. Yes, there were 6 GeForce2 cards in the race, but this was an article about all current 3D cards, correct? It was designed to help the consumer pick a card that offered both value and performance. Now perhaps if the GeForce2 cards were categorized into a separate category and given awards in their respective category, using an average for all of the GeForce cards vs. the ATI and 3dfx cards, perhaps that could solve the problem. But that's not exactly what the article was designed to do; it was designed to give a better insight on which card would be better to purchase. You're not going to go car shopping and just look at Ford, Chevy or Dodge, are you? And say that you'll buy a car based on that. You must look deeper and at the specific model # etc. (yeah, it's a bad analogy and spelling, but it's 5am so I don't care). Anyway, point being, I think a lot of you might have misunderstood this article; perhaps I'm wrong. But either way this article was exactly what I was looking for to help me make a decision.

    By Althor October 12, 2000, 05:52 AM

    Nice to see that everybody calmed down a little. Makes discussions much easier. There's no use in offending each other (except the fact that the number of posts increases :-)).
    Let's keep it like that.

    And thanks to the Sharky team (especially Alex) for really facing this discussion.
    That proved that you guys care about what you write!!! Keep up the good work.

    By Robo October 12, 2000, 07:50 AM

    quote:Originally posted by chrisangelini:
    Robo,

    I've actually yet to see a Voodoo5 that would reliably hit 183MHz. I'm sure they exist, but we have topped out at around 173 or 174 with all of ours.

    Ciao!

    !!!?!?!?!?!?!?!

    You're kidding? 173??? That's it? Hell, no wonder you think the thing is a dog.

    soooo... have I gotten lucky or did you get a few POS's or what? Bought the first one at Electronics Boutique on their "10-day 'borrow and test'" mode, and returned it (cuz I got my 64MB GTS in the mail), then bought the other 5500 on eBay.

    I started a thread a few weeks ago on the 5500 newsgroup asking about overclocks, and everyone I saw, save for one, hit 183+.

    ah...the wonderful world of overclocking...

    just outta curiosity, what mobo were you using? I/O voltage?


    P.S. Kudos to the entire Sharky team for coming into these forums with a friendly asbestos suit on and dealing with the unruly natives. Quite classy, IMHO. Good show.

    By chrisangelini October 12, 2000, 02:06 PM

    Robo,

    I've tested three myself, and they've likely tried two or three in San Jose. I've tried with the CUSL2 at default I/O and also 3.65V. Not sure if it was the motherboard or just the early boards that didn't overclock as well, but I wasn't expecting them to go through the roof anyways. Just curious, what's the highest frequency you've heard as stable for the Voodoo5?

    Ciao!

    By mpitts October 12, 2000, 02:31 PM

    quote:Originally posted by Ymaster:

    I love small posts...

    Me too

    By Jeff Golds October 12, 2000, 02:36 PM

    quote:Originally posted by mpitts:
    Me too

    Ditto.

    By Robo October 12, 2000, 02:48 PM

    Chris, from what I've seen, the highest frequency I've seen with just basic modifications (i.e. a few RAMsinks and/or replacing the main fans) is 198. I know of several peeps who have hit in the low 190s, and all, save for one, hit 183.

    Of course, all that is assuming they weren't lying.

    I've hit 190 with both of mine, perfectly stable on an MSI BX Master @ 3.5v I/O on 133 MHz FSB's.

    I wonder if you've had "first run" boards? i.e. boards that were among the first released? They had "issues" with some drunk dude doing a horrid glue-job with those Aavid HSF units.

    Clumped, crooked, lame glue jobs seemed to be a major cause of "non-overclockability" in the 5500's, both for FSB o/c'ing and core/mem o/c'ing. When those peeps yanked the HSFs off and reapplied them with an appropriate amount of compound, they were able to hit 89 MHz AGP and 183+ overclocks, when previously they weren't able to.

    Again, that's just from what peeps have spouted at the various forums. They could've been lying. I know I'm not, tho.

    By memphist0 October 12, 2000, 04:56 PM

    Hey Chris,
    I have a V5 that does 181 on a CUSL2 at default voltage, room temp around 80. I have been able to go as high as 185, but that is when the room temp is closer to 70.
    Robo, I wish I could reach 190+

    I do agree with Robo about the review seeming to be biased. On the scorecard near the end, I thought you hit the V5 a little harder than the others for bundle and ease of use. Also, based on what you said in this thread, FSAA/quality should be a ten, since the category is FSAA quality, not speed, and speed is what you seemed to have a problem with. The strangest scores were image quality: based on my own experience with a Prophet II and from what I heard from others, the V5 kicks the GTS's butt, especially in games like Q3A, where TC causes a smearing or blending of colors. I've also seen a few reviews that talked about this.

    It was because of reviews like these that I even played with the Prophet II and if I hadn't been able to return it I would have been disappointed with what my $300(it was a month or so ago) bought me.

    By Simonac October 12, 2000, 06:35 PM

    To all frustrated people who complain because someone didn't like the article: not liking ONE article doesn't mean one doesn't love the website.
    Show some maturity and rationality.

    By GTaudiophile October 12, 2000, 10:26 PM

    Come on... let's hit 100, people!!!

    By Althor October 13, 2000, 03:04 AM

    It's absolutely O.K. and legal to overclock, but in terms of a review, you have to take the default settings most people use (I mean most people "out there", not here in the Sharky forums where we tech freak idiots roam) ;-)
    So overclockability could be one thing you mention in the review, but it should not change the score it gets.
    On the other hand, you could make a dedicated "Hardcore Overclocking Max. Heat Review" (I would like that one).


    By yngwie98 October 13, 2000, 03:17 AM

    I'm quite happy with my Radeon 32MB DDR, which will wind up costing only $150. Q3 is much more playable and looks worlds better than it did on my old Xpert 98. Now I need to upgrade my gaming skills!

    By rye222 October 13, 2000, 04:00 AM

    I'm actually looking for the best 2D/3D card with GOOD TV-out but still value-conscious: no more than $200!!! Any suggestions??

    (By the way, we're almost at 100!)

    By OOAgentFiruz October 13, 2000, 04:05 AM

    An old ATI All-in-Wonder (AIW), maybe. Good luck, rye222.


    Contact Us | www.SharkyForums.com

    Copyright © 1999, 2000 internet.com Corporation. All Rights Reserved.


    Ultimate Bulletin Board 5.46
