  SharkyForums.Com - Print: Now that DX8 is here...

    Now that DX8 is here...
    By Azuth November 10, 2000, 10:49 AM

    Any bets on how well the Radeon will compare to the competition?

    I know all the companies will have some type of press release about DX8, but here's one about ATI.
    http://www.newswire.ca/releases/November2000/09/c3144.html

    By FreakyDeak November 10, 2000, 11:06 AM

    I think the GTS 2 Ultra is the card to get if you're Bill frickin' Gates!! ;-) However, I'm currently knee deep in video card research, and the leader seems to be the GeForce 2 GTS.
    But I have a Duron 800 system with the Abit KT7-RAID, and I haven't seen enough data yet saying whether it's OK or not with the Nvidia driver problems. I found the GTS Pro shipping for $362, but that's still a little out of reach.... So I'm leaning towards the GeForce 2 GTS 64MB - a 350 banger, with pixels coming out the wazoo! Or am I now too loopy to get any of these stats right????...
    Hmmm... Thanks for the venting sessions, those to come & gone...!
    http://www.newswire.ca/releases/November2000/09/c3144.html

    By Angelus November 10, 2000, 11:09 AM

    Couple of things that should/could happen:

    - Every hardware site out there should do their graphics card tests again. If what ATI claims is true, then the Radeon should be on top of the stack. Of course, I don't think it's gonna hurt the Nvidia cards to have DirectX 8 installed so I'm really looking forward to some comparison tests now.

    - The forum here is gonna get swamped with people having problems with the new version because it totally screwed up their system.
    Now this is something I hope DOESN'T happen (yeah I know, I'm too kind).

    - ATI and Nvidia lovers can start all over against each other about how great ATI is now that DirectX 8 is out and how Nvidia sucks for not having such a card yet. People, please stay nice to each other and smile.

    May DirectX 8 be with you!

    By Captain Iglo November 10, 2000, 11:12 AM

    Excuse me,...
    but don't you need an application BEFORE you can see the advantages of DX8???

    I mean, as long as the game does not use the Radeon's 3rd texture pipeline (and other features), we won't see B I G differences!

    By Azuth November 10, 2000, 11:25 AM

    Right, we won't see any difference until there is an app that takes advantage of DX8's new features.

    I'm sure there will be at least a demo app or a benchmarking app that will do this pretty quickly. Hopefully soon because I wanna know if my radeon was a great buy instead of just a good buy.

    By Heffe November 10, 2000, 11:58 AM

    I'd just love to have the computer AND the money to buy one, that's for sure!

    By Narchwoogle November 10, 2000, 02:09 PM

    Sure, there will be some benchmarks or something out there so that all the Radeon owners can justify their cards. However, there won't be any real games that people actually want to play supporting the new features in DirectX 8 for a while. By then, the Radeon will be obsolete, and all the people who bought it for its DirectX features will be scratching their heads. The cards will finally use the nifty features, but they'll have slow performance compared to newer cards, forcing them to buy new cards anyway. These people will in turn buy the new model ATI cards that offer inferior performance and buggy drivers because of DirectX 9 features they are supposed to use one distant day in the future. They'll suffer with their buggy drivers and slow performance, screaming that vengeance will be theirs some distant day in the future, when the card itself has grown obsolete.

    By BuggyLoop November 10, 2000, 02:16 PM

    I give it 4 months before we see a kickass game using DX8. Don't worry, games evolve fast.

    By Captain Iglo November 10, 2000, 02:24 PM

    Sorry, but I have to disagree:

    For people like me, who still have their old PII 400 and a TNT2, the situation looks the following way:
    I buy a complete rig every two years or so, keep upgrades cheap and rare, and use it until it really sucks, performance-wise!

    So, if I have an application which does not run acceptably, and the two years are well over, I get a new one.
    I get the best card available if it looks promising (the V2 back in the day - 1998), or get an average one which should last for a year or so.
    The only reason I have a TNT2 now is that I needed the additional PCI slot for a network card, and a V2 requires a primary video card - consuming 2 slots.

    Today, the Radeon seems the most future-proof card - not the fastest, of course!
    The reasons are as follows:
    1) Partial DX8 feature set - not complete, but the GeForce's DX7 set isn't complete either!
    2) Partial hidden surface removal (hierarchical Z). Overdraw imposes severe visibility constraints - blocking walls and so on, so as not to overload the graphics; without them, games would look much more natural - and overdraw is SURELY going to increase.

    So, in 1 year or even 2, GeForce cards will be slide-show performers due to lots of hidden surfaces.

    Of course, if you have the cash, you can get every new generation of cards.

    But there's a lot of time to come: Half-Life (today my favorite) runs smooth @ 1024*768, so I see no reason to upgrade yet!

    By cracKrock November 10, 2000, 02:32 PM

    Thanks to DX8, the Radeon will rock harder than it already does. If ATI can keep up the good work on their drivers, they should be able to tweak a little more speed there, too.


    ...
    check out: http://www.tranzor.net

    By Azuth November 10, 2000, 02:38 PM

    I was just about to post a similar response to Cap'n Iglo's. If you are the kind of guy who upgrades his video card or entire system every year, then the GeForce is the logical choice. I just upgraded from an old TNT. Even the TNT worked great with 99% of new games.

    My roommate is the kind of guy who upgrades every year. I upgrade every 2 years or more. My system still stays very close to his in performance. The only difference is that I have about another $1000 to spend every year. We run the same games and I still win the ones I'm good at.

    I wait till my system starts to run at an unacceptable level before I upgrade. My old C400A is starting to show its age, but there's not a game out that won't run just fine on it. For me, it makes sense to hold out till the 760 motherboards and DDR RAM become available (like after the initial hype dies down).

    I'd imagine the Radeon I just got about 2 weeks ago will last me about 2 years. If the games start to use it in 6 months then it was a smart buy.

    What I'm saying is that you can either spend the cash and buy the best, or you can spend wisely and plan for the future. The same can be said for all aspects of life.

    By Narchwoogle November 10, 2000, 02:52 PM

    In the future, most games will run better with the NVIDIA GeForce2 GTS than with the ATI Radeon. Here are some reasons why.
    1. There are a lot more GeForce2 cards out there than Radeons. Most developers are going to use GeForce2 cards as their primary development platforms. This will make games run faster on the GeForce2.
    2. Half the games that are coming out nowadays use OpenGL. The GeForce is better in OpenGL.
    3. The DirectX 8 features of the Radeon are not game-proven in the real world. For all you know, ATI may have done a poor job implementing them. In the end, using them on the Radeon may actually slow down the game significantly, and they will end up being disabled for the most part when using the Radeon.
    4. The developers have to actually write the game to use the new DirectX 8 features. Think about how long the T&L engine has been out, and yet the number of games that use it is not that many.
    5. NVIDIA may be able to implement some of the new DirectX 8 features into their drivers for the GeForce2, similar to how they did FSAA. Even though the features will be a software implementation, they will still work and might even end up being faster than the Radeon.

    By GTaudiophile November 10, 2000, 03:33 PM

    I recently upgraded to an ATI Radeon 32DDR from a Creative TNT2-Ultra 32SDR. Compared to the Radeon 32DDR with FSAA enabled at 1024x768 in Counter-Strike, 3D image quality with the TNT2-Ultra is like playing with foggy sunglasses on! The colors are so much sharper and clearer, and I have far fewer visual artifacts with the Radeon than I did with my TNT2-Ultra. My TV tuner even works better with the Radeon than it did with my TNT2U! Thumbs up to ATI and thumbs down to nVidia for me!

    Speed means jack! It's all about the image quality! Besides, in games like CS, you can only run so fast anyway. What about DVD quality? The GF2GTS can't touch the Radeon! What about price? With the rebate, the Radeon 32DDR costs some $100 less than the average GF2GTS! nVidia is getting lazy and arrogant, as Intel and Micro$oft did. It's about time another company came along and offered a card that is 90% as fast with better image and DVD quality and better value.

    BTW, the ATI Radeon CAN run everything it is supposed to run. To prove this, ATI included the Ark demo, something a GF2GTS couldn't dream of running.

    By cracKrock November 10, 2000, 03:47 PM

    Dude, why are you making these broad, false claims like, "Geforce is better in OpenGL" and "In the end, using them on the Radeon may actually slow down the game significantly and will end up being disabled for the most part when using the Radeon"???

    Do you have any *proof* of any of that or do you just like to bash the Radeon?

    Like I said, the Radeon supports every feature the GeForce2 supports AND THEN SOME.

    FACT: Games are getting visually more complex. If you have an object that requires more than 2 textures to be rendered, the GeForce will lose ground quickly to the Radeon's 3 texture mapping capabilities.

    FACT: The Radeon, though clocked SLOWER than the GeForce, has higher FPS than the GeForce at high-resolution 32-bit gaming. This goes to show you that ATI has a better engineered product. Rather than just throwing more DDR memory at the problem, ATI has actually found a way to save bandwidth.

    FACT: Just because a game is designed to run on a particular card doesn't mean the game won't run faster on another card. Do you think that a game designed on a S3 ViRGE card wouldn't run faster on a GeForce?

    Please, don't post that unsubstantiated nonsense anymore.

    quote:Originally posted by Narchwoogle:
    In the future, most games will run better with the NVIDIA GeForce2 GTS than with the ATI Radeon. Here are some reasons why.
    1. There are a lot more GeForce2 cards out there than Radeons. Most developers are going to use GeForce2 cards as their primary development platforms. This will make games run faster on the GeForce2.
    2. Half the games that are coming out nowadays use OpenGL. The GeForce is better in OpenGL.
    3. The DirectX 8 features of the Radeon are not game-proven in the real world. For all you know, ATI may have done a poor job implementing them. In the end, using them on the Radeon may actually slow down the game significantly, and they will end up being disabled for the most part when using the Radeon.
    4. The developers have to actually write the game to use the new DirectX 8 features. Think about how long the T&L engine has been out, and yet the number of games that use it is not that many.
    5. NVIDIA may be able to implement some of the new DirectX 8 features into their drivers for the GeForce2, similar to how they did FSAA. Even though the features will be a software implementation, they will still work and might even end up being faster than the Radeon.

    By Mighty Mighty Me November 10, 2000, 04:11 PM

    Pointless speculation. Everyone forgets there are plenty of DirectX 7.0a features that STILL aren't being used. The Radeon has more support for some of the DirectX 8.0 features, not all of them. Point being, the Radeon is a great card that has a lot of features, but it's probably not going to run faster now just because DirectX 8.0 is out. It will just have the ability to use more features of DirectX 8.0, if the programmers even use all those features. I mean hell, Voodoos are "optimized" for DirectX 8.0, and we all know how many features that POS has. To sum it all up, basically the Radeon is really not going to be the fastest card just because DirectX 8.0 is out now; it's just going to have the option to do more.

    By Azuth November 10, 2000, 04:41 PM

    Just out of curiosity, I have a question for the video card gurus that frequent this board. I remember reading somewhere(Tom's HW I think it was) that the GeForce does not have hardware environment bumpmapping support and that the Radeon does bumpmapping like 10,000 times faster than the GeForce. Is that true?

    I'm asking because bumpmapping is one feature that I KNOW we will see become standard in most new games. If I remember correctly, that is one of the new features of DX8 also. Does anyone know more about this?

    By BuggyLoop November 10, 2000, 04:50 PM

    Yeah Azuth, and I think that feature is already included in DX7. Insane (the game) already uses that feature, if I remember right (cool game).

    By Emilee November 10, 2000, 04:57 PM

    quote:Originally posted by Azuth:
    Right, we won't see any difference until there is an app that takes advantage of DX8's new features.

    I'm sure there will be at least a demo app or a benchmarking app that will do this pretty quickly. Hopefully soon because I wanna know if my radeon was a great buy instead of just a good buy.

    That's correct. But if you remember when T&L came out, there were a ton of demos that nVidia released that made everyone go "ohhh ahh". Companies do the same tricks.

    By The Grinch November 10, 2000, 05:25 PM

    quote:Originally posted by Narchwoogle:
    In the future, most games will run better with the NVIDIA GeForce2 GTS than with the ATI Radeon. Here are some reasons why.
    1. There are a lot more GeForce2 cards out there than Radeons. Most developers are going to use GeForce2 cards as their primary development platforms. This will make games run faster on the GeForce2.
    2. Half the games that are coming out nowadays use OpenGL. The GeForce is better in OpenGL.
    3. The DirectX 8 features of the Radeon are not game-proven in the real world. For all you know, ATI may have done a poor job implementing them. In the end, using them on the Radeon may actually slow down the game significantly, and they will end up being disabled for the most part when using the Radeon.
    4. The developers have to actually write the game to use the new DirectX 8 features. Think about how long the T&L engine has been out, and yet the number of games that use it is not that many.
    5. NVIDIA may be able to implement some of the new DirectX 8 features into their drivers for the GeForce2, similar to how they did FSAA. Even though the features will be a software implementation, they will still work and might even end up being faster than the Radeon.

    Tsk tsk tsk. My friend, there are two major differences between the technology of ATI's Radeon and nVidia's GeForce cards.

    Radeon --> Proactive technology
    GeForce --> Reactive technology

    There's a lot of technological advances built into the hardware of the Radeon. In the long run, the GeForce will be running out of steam... stuck in third gear, while the Radeon conservatively reclines the heated seats and shifts into fourth.

    Let's compare it to a car. My Pontiac Grand Prix GTP is governed to 127 MPH. A Chrysler 300M is a bit higher than that. Because my car has so much torque, I can beat the 300M on a 1/4 mile track. But on the highway, that same 300M is eventually going to pass me once it hits 128+ MPH.

    Get my point?

    By Ymaster November 10, 2000, 05:50 PM

    DAMN! DX8 is fast!!!!

    My jediknight2 game jumped big time. I'm at 1280x1024 with everything maxed and getting 60 fps!!!.. What's going to blow your mind is the hardware: V3 + K2-550 with the 1.06 drivers...

    By Jeff Golds November 10, 2000, 06:54 PM

    quote:Originally posted by Narchwoogle:
    In the future, most games will run better with the NVIDIA GeForce2 GTS than with the ATI Radeon. Here are some reasons why.
    1. There are a lot more GeForce2 cards out there than Radeons. Most developers are going to use GeForce2 cards as their primary development platforms. This will make games run faster on the GeForce2.
    2. Half the games that are coming out nowadays use OpenGL. The GeForce is better in OpenGL.
    3. The DirectX 8 features of the Radeon are not game-proven in the real world. For all you know, ATI may have done a poor job implementing them. In the end, using them on the Radeon may actually slow down the game significantly, and they will end up being disabled for the most part when using the Radeon.
    4. The developers have to actually write the game to use the new DirectX 8 features. Think about how long the T&L engine has been out, and yet the number of games that use it is not that many.
    5. NVIDIA may be able to implement some of the new DirectX 8 features into their drivers for the GeForce2, similar to how they did FSAA. Even though the features will be a software implementation, they will still work and might even end up being faster than the Radeon.

    Either you work for nvidia or you completely buy into their marketing. I particularly like #5, especially when you compare it to #4.

    Oh, regarding #3, if you disable the feature on the Radeon, then that leaves you in the same boat as the GeForce, so how is this supposed to be to nvidia's advantage?

    -Jeff

    By BuggyLoop November 10, 2000, 07:07 PM

    5. NVIDIA may be able to implement some of the new DirectX 8 features into their drivers for the GeForce2, similar to how they did FSAA. Even though the features will be a software implementation, they will still work and might even end up being faster than the Radeon.

    Software beating hardware? I think you are dreaming right now, wake up!

    By AssNasty November 10, 2000, 07:41 PM

    Hey Narchwoogle, all of your talk is just speculation and hearsay until people are actually using DX8, so quit talking out of your ass man, that's my job!

    Patience is a virtue, Narch, so just simmer down till we know for sure which card will rule Mount Olympus!

    By tomdepeche November 10, 2000, 07:56 PM

    Okay, one second... I don't think that the Radeon is capable of outperforming the Ultra version, so why are we even bothering about which company provides the fastest card.... One other thing... Why are so many people hooked on the "fastest" card? One BIG thing that makes me decide which one I want is image quality... It's like talking about American cars that are fast... If they can't handle the corners, they are worthless to me!

    By BuggyLoop November 10, 2000, 08:04 PM

    Some tests have been done with DX8 already,
    using WinTune; WinTune makes direct calls to the DX8 API, unlike MadOnion or whatever benchmark you can find.

    The Radeon scored 256% better than a GeForce Pro DDR, so yeah, once we see apps that use DX8, the Radeon will probably outperform a GeForce Ultra.

    Also, there's the known test using Quake 3 with 10 times the poly count and 4 times the texture load. Result:
    Athlon 1.1 with GF2 Ultra: 40.3
    Athlon 700 with Radeon 64DDR: 77.2

    Thanks to Hyper-Z.

    I would tell anyone that has the money to upgrade every 4 months to buy a GeForce, but when I buy a card, I want it to last at least a year and a half, up to 2 years, and that's where the Radeon wins.

    By Touma November 10, 2000, 09:10 PM

    Most likely, because right now the Radeon is using only 66.666% of its potential. Every game today only uses 2 of the 3 texture units which the Radeon is capable of processing. That means at the very least a 33.333% performance boost for future games over a GeForce2. And I'm talking before all the tweaking and optimization has happened.
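
    A side note on that arithmetic: counted in rendering passes, the gain depends on how many textures a surface actually needs. Here is a toy C++ sketch of the pass count - the texture counts are made-up examples, not measurements:

        #include <cstdio>

        // How many passes does a surface need if the card can apply
        // `units` textures per pass?  Simple ceiling division.
        static int PassesNeeded(int texturesOnSurface, int units)
        {
            return (texturesOnSurface + units - 1) / units;
        }

        int main()
        {
            const int texturesOnSurface = 3;   // e.g. base + lightmap + detail
            const int gtsUnits    = 2;         // GeForce2 GTS: 2 texture units
            const int radeonUnits = 3;         // Radeon: 3 texture units

            std::printf("2-unit card: %d pass(es)\n",
                        PassesNeeded(texturesOnSurface, gtsUnits));
            std::printf("3-unit card: %d pass(es)\n",
                        PassesNeeded(texturesOnSurface, radeonUnits));
            // For a 3-texture surface that is 2 passes vs. 1, so up to 2x on
            // that surface; averaged over a whole scene, where most surfaces
            // still use one or two textures, the real gain is much smaller.
            return 0;
        }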

    By Humus November 10, 2000, 09:34 PM

    quote:Originally posted by Azuth:
    Just out of curiosity, I have a question for the video card gurus that frequent this board. I remember reading somewhere(Tom's HW I think it was) that the GeForce does not have hardware environment bumpmapping support and that the Radeon does bumpmapping like 10,000 times faster than the GeForce. Is that true?

    I'm asking because bumpmapping is one feature that I KNOW we will see become standard in most new games. If I remember correctly, that is one of the new features of DX8 also. Does anyone know more about this?

    Yes, it's true. The GF does not support EMBM. While the software implementation got a score of ~5 fps, the Radeon did it at 1600 fps.
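
    For anyone wondering what EMBM actually computes: the bump texture stores per-texel (du, dv) offsets, a 2x2 "bump environment matrix" scales and rotates them, and the perturbed coordinates then index the environment map. A self-contained software sketch of just that math - the sampling functions are hypothetical procedural stand-ins, not any real driver code:

        #include <cmath>
        #include <cstdio>

        struct Color  { float r, g, b; };
        struct Offset { float du, dv; };

        // Hypothetical stand-in textures: a fake ripple bump map and a
        // fake sky-gradient environment map.
        static Offset SampleBumpMap(float u, float v)
        {
            return { 0.02f * std::sin(40.0f * u), 0.02f * std::cos(40.0f * v) };
        }
        static Color SampleEnvMap(float u, float v)
        {
            return { u, v, 1.0f };
        }

        // Core of environment-mapped bump mapping: perturb the environment
        // lookup by the matrix-transformed bump offsets.
        static Color EmbmTexel(float u, float v,
                               float m00, float m01, float m10, float m11)
        {
            Offset d = SampleBumpMap(u, v);
            float du = m00 * d.du + m01 * d.dv;
            float dv = m10 * d.du + m11 * d.dv;
            return SampleEnvMap(u + du, v + dv);
        }

        int main()
        {
            Color c = EmbmTexel(0.5f, 0.5f, 1.0f, 0.0f, 0.0f, 1.0f);  // identity matrix
            std::printf("%.3f %.3f %.3f\n", c.r, c.g, c.b);
            return 0;
        }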

    By Humus November 10, 2000, 09:37 PM

    quote:Originally posted by tomdepeche:
    Okay, one second... I don't think that the Radeon is capable of outperforming the Ultra version, so why are we even bothering about which company provides the fastest card.... One other thing... Why are so many people hooked on the "fastest" card? One BIG thing that makes me decide which one I want is image quality... It's like talking about American cars that are fast... If they can't handle the corners, they are worthless to me!

    Yes, image quality is underestimated. A Radeon rendering at 1280x1024 produces much better output than an Ultra at 1600x1200, and it's faster too.
    nVidia's shitty image quality has kept me away from their cards for a long time.

    By Captain Iglo November 11, 2000, 05:08 AM

    quote:Originally posted by Narchwoogle:
    In the future, most games will run better with the NVIDIA GeForce2 GTS than with the ATI Radeon. Here are some reasons why.
    1. There are a lot more GeForce2 cards out there than Radeons. Most developers are going to use GeForce2 cards as their primary development platforms. This will make games run faster on the GeForce2.
    2. Half the games that are coming out nowadays use OpenGL. The GeForce is better in OpenGL.
    3. The DirectX 8 features of the Radeon are not game-proven in the real world. For all you know, ATI may have done a poor job implementing them. In the end, using them on the Radeon may actually slow down the game significantly, and they will end up being disabled for the most part when using the Radeon.
    4. The developers have to actually write the game to use the new DirectX 8 features. Think about how long the T&L engine has been out, and yet the number of games that use it is not that many.
    5. NVIDIA may be able to implement some of the new DirectX 8 features into their drivers for the GeForce2, similar to how they did FSAA. Even though the features will be a software implementation, they will still work and might even end up being faster than the Radeon.

    I cannot prove the contrary, because all you say might come true - implementing DX8 just for show is a possibility!

    Of course, my claim concerning overdraw cannot be proven either, but there's an indication that it will come true: VillageMark 2000, where the Radeon 64MB scores 58 vs 33 fps against the GeForce2 GTS (@1024*768*32, I guess).

    After all, all of this is just a lucky guess.

    By Captain Iglo November 11, 2000, 05:14 AM

    quote:Originally posted by Azuth:
    Just out of curiosity, I have a question for the video card gurus that frequent this board. I remember reading somewhere(Tom's HW I think it was) that the GeForce does not have hardware environment bumpmapping support and that the Radeon does bumpmapping like 10,000 times faster than the GeForce. Is that true?

    I'm asking because bumpmapping is one feature that I KNOW we will see become standard in most new games. If I remember correctly, that is one of the new features of DX8 also. Does anyone know more about this?

    I cannot confirm this 10,000-times-as-fast-as-a-GeForce theory, but I know that even a GTS does not have all DX7 features implemented, one good example being environmental bump mapping. In fact, texture compression was part of DX6, but TNT2s don't support it, although their sticker says DX6 compliant!

    By Mighty Mighty Me November 11, 2000, 05:43 AM

    I honestly am not too sure that the Radeon's 3rd pipeline is going to be implemented over the 2-pipeline setup, as the 2-pipeline setup makes more sense than the three, so I wouldn't bank on that suddenly making the Radeon faster.
    Besides, seriously, how many features of DirectX 7.0 honestly get used? Realistically, not very many, and I don't think that trend is going to change real soon.

    If you're gonna praise the Radeon, praise it for its great image quality, DVD playback, and cost - not on "once DirectX 8 comes into play it's gonna be so much faster, and everything is gonna support this." Betting on the future with video cards is a risky business; how many features have we all seen never get used?

    By Humus November 11, 2000, 10:35 AM

    quote:Originally posted by Mighty Mighty Me:
    I honestly am not too sure that the Radeon's 3rd pipeline is going to be implemented over the 2-pipeline setup, as the 2-pipeline setup makes more sense than the three, so I wouldn't bank on that suddenly making the Radeon faster.
    Besides, seriously, how many features of DirectX 7.0 honestly get used? Realistically, not very many, and I don't think that trend is going to change real soon.

    If you're gonna praise the Radeon, praise it for its great image quality, DVD playback, and cost - not on "once DirectX 8 comes into play it's gonna be so much faster, and everything is gonna support this." Betting on the future with video cards is a risky business; how many features have we all seen never get used?

    I'm pretty damn sure it's gonna be used. John Carmack (lead programmer of Quake 3, if someone didn't know) has already stated that he will put the third texturing unit to good use. Adding support for the third texturing unit is quite simple, so why not do it when it boosts the framerate by about 20-30%?
    Probably most future cards are going to have three or more texture units; the Radeon will keep up with those a little longer, while the GTS will fall off rather quickly.

    By Emilee November 11, 2000, 11:51 AM

    quote:Originally posted by BuggyLoop:
    Some tests have been done with DX8 already,
    using WinTune; WinTune makes direct calls to the DX8 API, unlike MadOnion or whatever benchmark you can find.

    The Radeon scored 256% better than a GeForce Pro DDR, so yeah, once we see apps that use DX8, the Radeon will probably outperform a GeForce Ultra.

    Also, there's the known test using Quake 3 with 10 times the poly count and 4 times the texture load. Result:
    Athlon 1.1 with GF2 Ultra: 40.3
    Athlon 700 with Radeon 64DDR: 77.2

    Thanks to Hyper-Z.

    I would tell anyone that has the money to upgrade every 4 months to buy a GeForce, but when I buy a card, I want it to last at least a year and a half, up to 2 years, and that's where the Radeon wins.

    Can I get a link to that test? I want to humble my friends.

    By Diablo SV November 11, 2000, 12:36 PM

    RADEON ROCKS!
    IT'S DX8!!
    I THINK ATi IS THE ONLY ONE WHO HAS DX8 HARDWARE SUPPORT (NOT LIKE 3DFX OR nVIDIA, WHO DO IT IN DRIVERS!!), AND I THINK DX8 WILL BOOST THE RADEON'S PERFORMANCE.
    AND YES, BEFORE I FORGET, I'VE GOT A PICTURE OF THE RADEON MAXX - IT HAS DUAL VGA AND 3 GIGATEXELS!!....................

    By Humus November 11, 2000, 01:33 PM

    quote:Originally posted by Emilee:
    Can I get a link to that test? I want to humble my friends.

    http://www.aceshardware.com/Spades/read_news.php?post_id=15000330&keyword_highlight=10x

    By Emilee November 11, 2000, 02:06 PM

    Sweet, I love my Radeon, just wanted more ammo

    By Narchwoogle November 11, 2000, 03:56 PM

    quote:Originally posted by Jeff Golds:
    Either you work for nvidia or you completely buy into their marketing. I particularly like #5, especially when you compare it to #4.

    Oh, regarding #3, if you disable the feature on the Radeon, then that leaves you in the same boat as the GeForce, so how is this supposed to be to nvidia's advantage?

    -Jeff

    #3 is to NVIDIA's advantage because they write far superior drivers for their hardware. If both ATI and NVIDIA have a software implementation of a feature, you had better bet that NVIDIA's will be much better.

    By Narchwoogle November 11, 2000, 03:59 PM

    quote:Originally posted by The Grinch:
    Tsk tsk tsk. My friend, there are two major differences between the technology of ATI's Radeon and nVidia's GeForce cards.

    Radeon --> Proactive technology
    GeForce --> Reactive technology

    You think the Transform and Lighting engine was not proactive? GeForce is proactive with their technology too. It's just that they have not actually come out with a new graphics chip in a while.

    By blankman November 11, 2000, 04:35 PM

    Narch, man, why don't you just concede? ATi has come out with, what, like 4 drivers in 4 months or so? Those drivers settled compatibility issues and raised performance by about 5 to 10%. Saying that ATI makes shitty drivers is no longer a valid argument. Yes, they couldn't write drivers for shit back when the Rage 128 came out, but the truth is that they write good drivers now.

    By The Grinch November 11, 2000, 04:47 PM

    quote:Originally posted by Narchwoogle:
    You think the Transform and Lighting engine was not proactive? GeForce is proactive with their technology too. It's just that they have not actually come out with a new graphics chip in a while.

    I was wondering how long it would take for you to reply to that.

    Some believe that the Radeon card will speed up once the new DirectX 8 features are supported. I personally don't believe an increase will happen, due to the fact that the card now has to do a bit of extra work. I also believe that the card won't slow down either, since it has the extra technology to handle it. The GeForce, however, won't have these extra little motors inside it to compensate. So I believe that while the Radeon will stand still in performance, the GeForce cards will slow down due to the extra load they were never designed for. Unless nVidia bumps the clock speed "again", only the next-gen nVidia cards will see speed improvements.

    By cracKrock November 11, 2000, 05:33 PM

    ATi has gotten their stuff together with the Radeon card. They know that good drivers will make or break this card. No matter how good the hardware is, they had to produce good drivers to run it.

    Take a look at PowerVR chips. They are truly revolutionary with their tile-based rendering technique. But they have tons of problems with their drivers to the extent that it keeps people from giving them good reviews or buying them.

    So, let ATi have their day in the sun. nVidia will probably recapture the video card crown with their next offering. That is, until the Radeon II comes out. hehe. :-)


    ...
    check out: http://www.tranzor.net

    By Narchwoogle November 12, 2000, 12:15 AM

    quote:Originally posted by blankman:
    Narch, man, why don't you just concede? ATi has come out with, what, like 4 drivers in 4 months or so? Those drivers settled compatibility issues and raised performance by about 5 to 10%. Saying that ATI makes shitty drivers is no longer a valid argument. Yes, they couldn't write drivers for shit back when the Rage 128 came out, but the truth is that they write good drivers now.

    No, why don't you wake up and look beyond your ATI bias? The ATI Radeon doesn't work with Windows 2000 + AMD CPUs + Quake III or MDK 2. Check out Sharky's article on the ASUS V7100. ATI has driver problems with the Radeon.
    http://www.sharkyextreme.com/hardware/reviews/video/asus_v7100_mx/8.shtml

    By BuggyLoop November 12, 2000, 12:55 AM

    God, Narch, search a little before posting a benchmark result nearly older than Win2K itself (exaggerating, but it's the truth).
    7020 + DX8 + Win2K = fine

    By Narchwoogle November 12, 2000, 11:09 AM

    quote:Originally posted by BuggyLoop:
    God, Narch, search a little before posting a benchmark result nearly older than Win2K itself (exaggerating, but it's the truth).
    7020 + DX8 + Win2K = fine

    OMG... Those benchmarks were done LAST WEEK by Sharky! Here is the link to the first page of the article, which shows a date of November 9th. That's only a few days old. http://www.sharkyextreme.com/hardware/reviews/video/asus_v7100_mx/
    So quit trying to discredit me by making me look like I don't know what I'm talking about. I'm simply presenting the facts. They are not older than Windows 2000. Quake 3 and MDK 2 and Evolva were all having problems under Win2K with the Radeon just a few days ago, right here on Sharky's. This is where the ATI drivers have their problems.

    As far as DirectX 8, the Radeon, and Windows 2000 go, how can you say it's fine? Nobody has done any benchmarks on it yet. For all you know, it might be even more buggy than it was with DirectX 7! The fact remains, you don't know, and looking at their track record I'm betting that there are going to be even more stability problems.

    Here is where the ATI Radeon fails with Quake III Arena MAX on an AMD Duron 700MHz, Win2K Pro. http://www.sharkyextreme.com/hardware/reviews/video/asus_v7100_mx/8.shtml

    This is where the ATI Radeon fails with MDK 2 Max 32-bit T&L on an Intel Celeron 700MHz, Win2K Pro, AND it fails with MDK 2 Max 32-bit T&L on an AMD Duron 700MHz, Win2K Pro. http://www.sharkyextreme.com/hardware/reviews/video/asus_v7100_mx/10.shtml

    Here is where the Radeon fails again: Evolva 1024x768x32, Windows 2000 Pro. http://www.sharkyextreme.com/hardware/reviews/video/asus_v7100_mx/11.shtml

    Are we starting to see a pattern here, or do I need to spell it out for you? The ATI Radeon has big-time stability problems under Windows 2000. Don't get an ATI Radeon unless you plan on not using Windows 2000.

    By Humus November 12, 2000, 11:56 AM

    quote:Originally posted by Narchwoogle:
    OMG... Those benchmarks were done LAST WEEK by Sharky! Here is the link to the first page of the article, which shows a date of November 9th. That's only a few days old. http://www.sharkyextreme.com/hardware/reviews/video/asus_v7100_mx/
    So quit trying to discredit me by making me look like I don't know what I'm talking about. I'm simply presenting the facts. They are not older than Windows 2000. Quake 3 and MDK 2 and Evolva were all having problems under Win2K with the Radeon just a few days ago, right here on Sharky's. This is where the ATI drivers have their problems.

    As far as DirectX 8, the Radeon, and Windows 2000 go, how can you say it's fine? Nobody has done any benchmarks on it yet. For all you know, it might be even more buggy than it was with DirectX 7! The fact remains, you don't know, and looking at their track record I'm betting that there are going to be even more stability problems.

    Here is where the ATI Radeon fails with Quake III Arena MAX on an AMD Duron 700MHz, Win2K Pro. http://www.sharkyextreme.com/hardware/reviews/video/asus_v7100_mx/8.shtml

    This is where the ATI Radeon fails with MDK 2 Max 32-bit T&L on an Intel Celeron 700MHz, Win2K Pro, AND it fails with MDK 2 Max 32-bit T&L on an AMD Duron 700MHz, Win2K Pro. http://www.sharkyextreme.com/hardware/reviews/video/asus_v7100_mx/10.shtml

    Here is where the Radeon fails again: Evolva 1024x768x32, Windows 2000 Pro. http://www.sharkyextreme.com/hardware/reviews/video/asus_v7100_mx/11.shtml

    Are we starting to see a pattern here, or do I need to spell it out for you? The ATI Radeon has big-time stability problems under Windows 2000. Don't get an ATI Radeon unless you plan on not using Windows 2000.

    Since they are using such old drivers, I wouldn't expect better results. If you put the MX into this test with its shipping drivers, it wouldn't be especially impressive either.

    By blankman November 12, 2000, 01:32 PM

    Narch, buddy, please take a look at page 4 of the review that you have provided. The Radeon was using the 3035 drivers. Those are the old ones. The newest ones are the 7020. If you don't believe me, here's the URL.
    http://www.sharkyextreme.com/hardware/reviews/video/asus_v7100_mx/4.shtml

    Now take a look at the newest ATi drivers.
    http://www.rage3d.com/files/drivers/radeon.shtml

    As you can see, Sharky was using OLD drivers.

    By Mighty Mighty Me November 12, 2000, 02:57 PM

    Somewhere along the line, people seem to have forgotten how few DirectX features actually get supported. It's not like in 2 weeks we're gonna see a bunch of games supporting every feature. As for the third pipeline, I guess we'll have to wait and see what developers (cards and games) decide to support. I think we'll see 2, then 4, then 6, etc., but what do I know.

    Honestly, the Radeon and the GeForce are 2 damn good cards that just about equal each other in a lot of respects. What I don't get is, when you have such great, cheap, smart cards, why are people still buying V4s & V5s?

    By BuggyLoop November 12, 2000, 04:01 PM

    /ignore narch from now on

    At least do some research before replying. The drivers Sharky used are older than Win2K itself; the Radeon was out before Win2K - at least the drivers that were shipping with it were.
    Since Sharky used the oldest driver possible for the Radeon under Win2K, why didn't they do the same with the MX? That beats me. Biased as always; I haven't read Sharky's reviews in a while.

    Anyway, since all you're trying to prove is that you hate ATI, with all the facts I gave you against your V5 or GeForce, well, I guess I can just ignore your ignorance.

    *V5 is the best card in the world!* rofl

    By Phatman November 12, 2000, 04:14 PM

    Nobody likes you Narchwoogle...

    By Humus November 12, 2000, 08:58 PM

    quote:Originally posted by Mighty Mighty Me:
    As for the third pipeline, I guess we'll have to wait and see what developers (cards and games) decide to support. I think we'll see 2, then 4, then 6, etc., but what do I know.

    I assume you mean the third texturing unit. It seems to me that a lot of people are doubting the usefulness of a third texturing unit. I do a lot of programming in OpenGL, and I'd take a fourth texturing unit too.
    Adding support for the third texturing unit is easy, and since it may boost performance by 20-30%, I cannot see any reason not to do it.
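
    Since Humus mentions doing this in OpenGL, here is roughly what "adding support for the third unit" looks like with ARB_multitexture. This is a fragment, not a full program: it assumes a current GL context and that the extension entry point has already been fetched (wglGetProcAddress / glXGetProcAddress), and the texture IDs are placeholders.

        // On Windows, include <windows.h> before the GL headers.
        #include <GL/gl.h>
        #include <GL/glext.h>   // GL_TEXTURE0_ARB, GL_MAX_TEXTURE_UNITS_ARB, PFNGLACTIVETEXTUREARBPROC

        // Bind base texture + lightmap + detail map to as many texture units
        // as the card exposes (2 on a GTS, 3 on a Radeon).  Returns how many
        // were bound; the caller schedules extra passes for any that did not fit.
        static int BindSurfaceTextures(PFNGLACTIVETEXTUREARBPROC glActiveTextureARB,
                                       const GLuint textures[3])
        {
            GLint units = 1;
            glGetIntegerv(GL_MAX_TEXTURE_UNITS_ARB, &units);

            GLint bound = (units < 3) ? units : 3;
            for (GLint i = 0; i < bound; ++i)
            {
                glActiveTextureARB(GL_TEXTURE0_ARB + i);
                glEnable(GL_TEXTURE_2D);
                glBindTexture(GL_TEXTURE_2D, textures[i]);
                glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
            }
            return bound;
        }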

    By boob November 13, 2000, 01:00 AM

    I should be getting my Radeon 32MB DDR in the mail tomorrow. The rest of my new system isn't ordered yet, but I'm really looking forward to checking it out. I paid $138 for it shipped through buy.com (used a 30-dollar-off coupon). Awesome buy and an awesome card, if you ask me.

    By Narchwoogle November 13, 2000, 11:30 AM

    quote:Originally posted by blankman:
    Narch, buddy, please take a look at page 4 of the review that you have provided. The Radeon was using the 3035 drivers. Those are the old ones. The newest ones are the 7020. If you don't believe me, here's the URL.
    http://www.sharkyextreme.com/hardware/reviews/video/asus_v7100_mx/4.shtml

    Now take a look at the newest ATi drivers.
    http://www.rage3d.com/files/drivers/radeon.shtml

    As you can see, Sharky was using OLD drivers.

    Ahh. Thanks for explaining this. Finally, someone who makes an honest effort to make everything clear. You explained this in a well written post instead of simply blurting out insults in fragmented sentences. I wonder why Sharky did that? He should have downloaded the newer drivers for that review. I'm disappointed that Sharky would do such a thing.

    By Narchwoogle November 13, 2000, 11:33 AM

    quote:Originally posted by Phatman:
    Nobody likes you Narchwoogle...

    LOL. My feelings are hurt. :P

    By Kitty FishSticks November 13, 2000, 12:07 PM


    RE: RADEON DX8 SUPPORT VERSUS NVIDIA

    MY REPLY : "I like cheese"

    'nuff said.

    By Humus November 13, 2000, 12:27 PM

    quote:Originally posted by Narchwoogle:
    Ahh. Thanks for explaining this. Finally, someone who makes an honest effort to make everything clear. You explained this in a well written post instead of simply blurting out insults in fragmented sentences. I wonder why Sharky did that? He should have downloaded the newer drivers for that review. I'm disappointed that Sharky would do such a thing.

    It's probably that Sharky doesn't want to use "unofficial" drivers. The latest supported drivers are the 3035.

    By Narchwoogle November 13, 2000, 01:47 PM

    quote:Originally posted by Humus:
    It's probably that Sharky doesn't want to use "unofficial" drivers. The latest supported drivers are the 3035.

    Another excellent point. On the ATI site, their "OFFICIAL DISPLAY DRIVERS for Windows 2000" is version 5.0.3027.

    By ovkearth November 13, 2000, 05:27 PM

    Wow, not too bad anyway...
    Well, here's what I think about that ATI vs. nVidia thing...
    As everyone should know, ATI is the leading manufacturer in the OEM market; they even earn more money than nVidia + 3dfx together.
    They did a really good job with the Radeon, as far as I can judge (I've got an MX2), but there is a BIIG and rather important difference between these two pieces of hardware... the GeForce2 GTS or whatever is based on the (RIP) chip layout of nVidia's original GeForce chip, so comparing an almost brand-new (in terms of availability) Radeon to a new GeForce Pro is even worse, as it isn't really more than a tuned-up GeForce2... Given ATI's huge budget and their leading position in the consumer market, it would have been a shame for them to produce a worse chip than the significantly older nVidia counterpart...
    If you bought a GeForce2 GTS some time ago, you got a really great deal and still have one of the fastest cards on the market...
    Thus I'd never think of buying a RADEON when there's a new chip generation just waiting to be revealed.....
    If you have to upgrade your system right now (well, who doesn't have to), have a closer look and decide for yourself.
    The Radeon has interesting features which it will pass on to the following generations, and this may give ATI the pole position further down the road.
    One last thing I wanted to point out is the rather low memory bandwidth of today's boards... I guess significant improvements won't be achieved unless affordable memory solutions are available...
    So far, thanks for reading......

    By Azuth November 13, 2000, 06:12 PM

    So what you are saying, ovkearth, is that you should always buy the next generation and not the current one?

    Sorry, but I prefer to actually have a computer I can use. There comes a time when you actually need to go to the store rather than wait till the next generation.

    Have you seen a release date for the Radeon II or the NV20? Do you even know if they will cost less than $400? I already heard the NV20 was going to be an expensive one. That means we've got about a year to wait till we see another budget Nvidia card.

    Unless you've got a release date, price, and performance specs for a card coming out in the next month, then you've really gotta just consider what is out now. When you've got an old TNT like I did a month ago, then you've gotta make a decision 'cause waiting just isn't an option anymore.

    By Stryfe November 13, 2000, 07:34 PM

    I'm a bit skeptical about new features. How long has the GeForce been out, and how many games support the T&L engine? I have no doubt the Radeon is an awesome card (much better than my MX2 anyway), but talk about the feature set that's available now, and the advantages the card has NOW over the nVidia products. This whole 'DirectX 8 is gonna make my Radeon the best card on the planet for the next 2 years' talk is ridiculous. In the next year there'll be new, better cards from both companies. There's always something new. For people getting a new card: get the best card you can for the amount of money you feel comfortable spending. The longevity of one card over another is probably closer to a few months than a year or more.

    By Bog_Trooper November 13, 2000, 09:16 PM

    ATI is not the OEM leader!

    Nvidia is; they have about 50% market share now, compared to ATI's dwindling 34% share.

    Toshiba, the world's largest laptop maker, just announced the award of their OEM contract to Nvidia's GeForce2 Go chip.

    ATI did not make more money than Nvidia; they have been off their estimates for the last 3+ quarters... Nvidia has doubled their revenue each quarter for 3+ quarters straight.

    link
    http://yahoo.cnet.com/news/0-1006-200-3607766.html?pt.yfin.cat_fin.txt.ne

    By ovkearth November 14, 2000, 07:50 AM

    Alright...
    I apologize for posting wrong facts; however, ATI is a big player in the OEM market. It's not important whether they are still no. 1 or have been overtaken by nVidia due to their new mobile GPU (GeForce2 Go) or whatever...
    I never meant to say that you have to wait for the next-generation graphics chips; I just wanted to explain the ongoing increase in quality and performance...
    I bought a Hercules Prophet II MX about 3 months ago - well, I had to pay $200 because the card was out of stock almost everywhere (damn)...
    Q3 and most other games also did fine on my TNT-based card, but due to the increased use of hardware T&L (games based on the Q3 engine) I decided to get myself a better card. But why buy a $500+ card when you can get enough power for under $200 anyway...
    I won't buy a new card unless nVidia introduces their new generation, and even then I'll wait for prices to drop; perhaps I'll get a RADEON DDR then - it should be much cheaper...
    The Radeon might be (MORE) interesting as an upgrade in half a year, due to its great features. If you have to upgrade now, well, buy whatever you like; I guess you'll make a good choice.
    OK then, hope you get my point now... the Radeon chip is newer and better, but someday nVidia will have the newer chips, and so on...
    Both companies produce great chips; I cannot understand anyone saying ATI is better than nVidia or the other way around...
    Thanks, ovk...

    By Humus November 14, 2000, 11:09 AM

    quote:Originally posted by Stryfe:
    I'm a bit skeptical about new features. How long has the GeForce been out, and how many games support the T&L engine? I have no doubt the Radeon is an awesome card (much better than my MX2 anyway), but talk about the feature set that's available now, and the advantages the card has NOW over the nVidia products. This whole 'DirectX 8 is gonna make my Radeon the best card on the planet for the next 2 years' talk is ridiculous. In the next year there'll be new, better cards from both companies. There's always something new. For people getting a new card: get the best card you can for the amount of money you feel comfortable spending. The longevity of one card over another is probably closer to a few months than a year or more.

    The reason T&L hasn't been used much is that it has to be taken into account from the beginning of a game's development to make any significant difference. Using T&L while maintaining backward compatibility with older cards is a more difficult task than adding support for, say, the Radeon's third texturing unit while maintaining compatibility with a two- or single-texturing-unit setup.
    DX8 isn't what's going to make the Radeon better in the near future; it's the higher effective memory bandwidth and the third texturing unit that are gonna do it. Games will continue to get higher-resolution textures and more texturing passes, and the Radeon will have a significant advantage over the GTS in such circumstances.

    By Captain Iglo November 14, 2000, 11:22 AM

    Any game using OpenGL takes advantage of TCL/T&L, whatever the card. The problem is that games only use some thousands of polygons/sec while video cards could do ~25 million/sec (GTS - official - 25 million; Radeon - 30 million @ 200 MHz, so at its actual clock (183 or 166)/200 * 30 = about 27 or 25 million, also official specs)...
    I don't know whether this is correct or not, but both should be able to display at least 10 million/sec - a HUGE step!

    Besides, does any game use the lighting capabilities of these cards? Radeons can display 8 light sources at a time (I know NV can do that too, but I don't remember how many exactly).

    By Stryfe November 14, 2000, 12:22 PM

    The point is, the game companies have to keep the low-end market in mind when they produce games. Your game won't sell well if you aim at the high end and forget everyone else. So the logical conclusion for people buying/upgrading hardware is speed over new features that aren't being used by anyone now. Of course, display quality is important also. I got a TNT2 Ultra last year because it supported 32-bit gaming. That never really panned out with that card, because by the time games REALLY supported 32-bit, the card was too slow to run them at a good res/framerate. That's not a mistake I'd make again. The GeForce is the more popular card (by sales), and game development for the next year or so, I'm sure, will reflect that. The game companies would love to use the latest, greatest features available, but that's not what sells games.

    By cracKrock November 14, 2000, 12:45 PM

    The Radeon's new "features" won't be implemented for another generation of games or so, but ATi has introduced new video card technology that will improve your performance with *all* games - HyperZ.
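
    "HyperZ" is ATI's umbrella name for a set of Z-buffer bandwidth tricks (hierarchical Z, Z compression, fast Z clear). The hierarchical-Z part boils down to something like the following software sketch - purely illustrative of the idea, not ATI's actual hardware logic:

        #include <algorithm>
        #include <vector>

        // The screen is split into 8x8 tiles; each tile remembers the farthest
        // depth currently stored in it.  If an incoming triangle's nearest depth
        // in a tile is still behind that value, every covered pixel is hidden,
        // so the per-pixel Z reads and colour writes for that tile can be
        // skipped entirely - which is where the bandwidth saving comes from.
        struct HierZ
        {
            int tilesX, tilesY;
            std::vector<float> tileMaxDepth;   // farthest depth per tile, 1.0 = far plane

            HierZ(int w, int h)
                : tilesX((w + 7) / 8), tilesY((h + 7) / 8),
                  tileMaxDepth(static_cast<size_t>(tilesX) * tilesY, 1.0f) {}

            // False means the whole tile can be rejected without touching the Z buffer.
            bool TileMightBeVisible(int tx, int ty, float triangleNearestZ) const
            {
                return triangleNearestZ <= tileMaxDepth[static_cast<size_t>(ty) * tilesX + tx];
            }

            // After actually rasterising into a tile, store the recomputed
            // farthest depth (it can only move closer, never farther).
            void UpdateTile(int tx, int ty, float newFarthestDepth)
            {
                float &d = tileMaxDepth[static_cast<size_t>(ty) * tilesX + tx];
                d = std::min(d, newFarthestDepth);
            }
        };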

    By Humus November 14, 2000, 02:39 PM

    quote:Originally posted by Captain Iglo:
    Any game using OpenGL takes advantage of TCL/T&L, whatever the card. The problem is that games only use some thousands of polygons/sec while video cards could do ~25 million/sec (GTS - official - 25 million; Radeon - 30 million @ 200 MHz, so at its actual clock (183 or 166)/200 * 30 = about 27 or 25 million, also official specs)...
    I don't know whether this is correct or not, but both should be able to display at least 10 million/sec - a HUGE step!

    Besides, does any game use the lighting capabilities of these cards? Radeons can display 8 light sources at a time (I know NV can do that too, but I don't remember how many exactly).

    True, but I was thinking more about the level design stuff. To be able to have both support for T&L and a low-polygon-count mode requires quite a flexible editor/game engine. That takes its time.

    Not many games use the lighting capabilities of the cards, but they are more widely used by professional 3D applications. Most game engines use lightmapping, but vertex lighting may be used on models or other non-static objects sometimes.

    By BuggyLoop November 14, 2000, 02:53 PM

    OK, after all this talk about technology, I think everyone can agree that we CAN'T wait to see those features in the next gen of games. It will be soooooo beautiful.

    By Jeff Golds November 14, 2000, 03:23 PM

    quote:Originally posted by Captain Iglo:
    Any game using OpenGL takes advantage of TCL/T&L, whatever the card,

    I don't think Unreal gets much benefit; it's mostly CPU bound. Its engine was designed for software rendering; D3D and OpenGL were an afterthought.

    quote: The problem is that games only use some thousands of polygons/sec while video cards could do ~25 million/sec (GTS - official - 25 million; Radeon - 30 million @ 200 MHz, so at its actual clock (183 or 166)/200 * 30 = about 27 or 25 million, also official specs)...
    I don't know whether this is correct or not, but both should be able to display at least 10 million/sec - a HUGE step!

    I seriously doubt that today's cards have the bandwidth to support that many polygons per second.

    quote:
    Besides, does any game use the lighting capabilities of these cards? Radeons can display 8 light sources at a time (I know NV can do that too, but I don't remember how many exactly).

    No, all PC games I am aware of use lightmaps. Part of the problem with HW lighting is that it is not very realistic. HW lighting is based on the OpenGL lighting model, which leaves much to be desired in terms of realism.

    Look at movies like "Toy Story". The lighting model used is incredibly complex and can't be imitated with current HW lighting. With lightmaps, however, you can precompute a bunch of data that might give better results.

    -Jeff
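
    To make the lightmap point concrete: the fixed-function hardware lighting Jeff refers to is essentially a per-vertex N.L term (plus ambient/specular) evaluated at run time, while a lightmap just multiplies a precomputed per-texel brightness into the base texture, so the offline pass can be as fancy as you like (radiosity, raytraced shadows). A toy C++ comparison, with made-up values:

        #include <algorithm>
        #include <cstdio>

        struct Vec3 { float x, y, z; };

        static float Dot(const Vec3 &a, const Vec3 &b)
        {
            return a.x * b.x + a.y * b.y + a.z * b.z;
        }

        // Fixed-function style diffuse term: computed at run time, per vertex,
        // for a handful of simple light types only.
        static float VertexDiffuse(const Vec3 &normal, const Vec3 &lightDir)
        {
            return std::max(0.0f, Dot(normal, lightDir));   // N . L, clamped
        }

        // Lightmap style: an offline tool bakes a brightness per lightmap texel;
        // at run time it is just a texture fetch modulated into the base texture
        // (GL_MODULATE, in effect).
        static float LightmappedTexel(float baseTexel, float lightmapTexel)
        {
            return baseTexel * lightmapTexel;
        }

        int main()
        {
            Vec3 n = { 0.0f, 1.0f, 0.0f };          // surface facing up
            Vec3 l = { 0.0f, 0.707f, 0.707f };      // light at 45 degrees
            std::printf("vertex-lit:  %.3f\n", VertexDiffuse(n, l));
            std::printf("lightmapped: %.3f\n", LightmappedTexel(0.8f, 0.6f));
            return 0;
        }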


