
  • SharkyForums.Com - Print: ATi's new chip to beat GeForce3

    ATi's new chip to beat GeForce3
    By RadeonMAXX March 03, 2001, 11:42 PM

    According to an interview at ebns.com with David E. Orton, ATi's president and chief operating officer, ATi's new GPU will beat GeForce3's specifications and will offer a programming capability as well.

    Although he didn't give a specific date for the release of ATi's next gen. chip, he did say it would be this year - (like that really helps those of you who are foaming at the mouth right now).

    You can check out the entire article here.... http://www.ebns.com/story/OEG20010302S0094

    By BuggyLoop March 03, 2001, 11:47 PM

Oh, I don't worry about ATi's hardware; they've been ahead of nVidia with the Radeon in terms of technology. But what about the software? I sure hope they get rid of the Rage128 driver architecture, or I'll be one angry dude.

    By RLX7 March 03, 2001, 11:53 PM

Hopefully ATi can improve their driver support for the current cards as well as the new cards to come. That would really help them get deep into the mainstream of the high-end and mid-range GPU market!

    By Shades March 04, 2001, 12:07 AM

    The only thing that's preventing me from considering buying the Radeon2 is the crappy drivers.

    I hope ATi will fix them soon...real soon.

    By Willy-Willy March 04, 2001, 12:44 AM

    The only way to get ATI back in the whole game is make a damn good kick ass driver!!

    By James_Emerson March 04, 2001, 12:45 AM

Hell, if I were the president of ATI I would have said the same thing: "My card is better than the GF3!" Who wouldn't? Do you think Bill Clinton would say "I had sex with that woman"? Hell no.. hehehe

    By jagojago12 March 04, 2001, 01:12 AM

    I thought the Radeon SE was fake..?

    By Cam March 04, 2001, 08:57 AM

Of course he would say that. That way you won't buy the GF3 and will wait to see the ATI card instead. It's called business strategy.

    By tomdepeche March 04, 2001, 09:15 AM

What's wrong with ATi's drivers? I've never had any problems with any of the games that I have tried so far!

    By Bog_Trooper March 04, 2001, 09:31 AM

you must have a limited supply of games...some games like Sacrifice don't even run on Radeons (apparently).

My take on that statement and the unknown release date of the Radeon II is:

Let's wait to see what they (NVIDIA) put on the table, then take an extra 6 months to engineer our card to their specs and add any little extra feature we can so we can top their card. ATI has always lacked vision and leadership in the 3D market....they don't have the imagination that 3dfx and NVIDIA had to innovate. Don't use the "NVIDIA just clocks it faster, that's not innovation" argument either...every year they have substantially upped the ante.

NVIDIA could say "our next card will beat the current card" and we would all say DUH??!! So why do ATi fans who all own broken Radeon chips go "Ooh and Aah and see, see...I told you" when ATi says "our next card will be better than the current cards"?

Duhh!!??

We'll see when they get it to market.

God I'm an editing monkey....
P.S. The expected profit loss will not help them get it to market quickly either.

    By DRYNDRYN2 March 04, 2001, 09:44 AM

quote:Originally posted by Bog_Trooper:
you must have a limited supply of games...some games like Sacrifice don't even run on Radeons (apparently). [snip]


Oh, great innovation. They wait to see what the competition has, and then steal it. That's very innovative. So what's wrong with ATI doing the same thing???????? If NVIDIA can, why not ATI???

And Sacrifice runs on the RADEON; there is a problem in 32-bit color only, it's normal in 16-bit.

That's BS. What's broken in the RADEON chip?????

    By RadeonMAXX March 04, 2001, 09:50 AM

Broken Radeon chip? Please explain to me what's broken about it? Profit loss? Who hasn't reported some type of loss lately? The stock market has been getting slammed hard; tech & dot-com stocks are obviously getting it the worst. LOL...Sacrifice doesn't run on the Radeon? Sacrifice runs smooth as a baby's ass on my Radeon 32MB DDR. UNDYING puts Sacrifice to shame though. God, OpenGL freakin' rules! And when Orton says they will beat the GF3 on specs, he most likely means features and not speed, like maybe some of you guys are thinking. The GF3 will support features that the current Radeon has (except for maybe programmable vertex shaders)...and the Radeon II will be 100% fully DX8 compliant.

    By Azuth March 04, 2001, 10:37 AM

Yup, no problems here with the Radeon. I run a HUGE variety of games on my PC too, unlike most of the FPS freaks on these boards. I'll admit that the Win2k drivers aren't perfect yet, but it's only a matter of time, and I don't use Win2k on my gaming system anyway.

Personally, I think the GeForce 3 is a joke. Its huge price tag limits its market to only a small crowd. All the features that make it special won't even be used until later this year at best. By then the Radeon 2 will be out and will boast similar performance for a much lower cost.

I'd even say there is a good chance it will be faster than the GeForce 3 in fps. The current Radeon is right up there with the GeForce 2 Ultra in terms of 32-bit performance. The second generation of HyperZ might be enough to make it faster than the GeForce 3.

    By Elxman March 04, 2001, 01:39 PM

quote:Originally posted by Bog_Trooper:
you must have a limited supply of games...some games like Sacrifice don't even run on Radeons (apparently). [snip]

    hyper-z wasn't innovative enough?

    By 3dcgi March 04, 2001, 02:48 PM

    Many people are saying that nVidia is not innovative or ATI is not innovative. I disagree. Both companies have been innovative in one way or another.

Graphics cards would not be where they are now if there had been only one company. 3dfx started a revolution by bringing 3D cards to consumers, but others have brought enhancements. Yes, ATI's Hyper-Z is an innovative approach to reducing memory bandwidth. In my opinion nVidia has picked up as a trailblazer where 3dfx left off.

They were first to market with hardware T&L, even though they knew full well it would be a while before it made a performance difference in games. Now they've done it again with programmable shaders.

    Even though Matrox is behind in the 3d race they've innovated with features like dualhead and bump mapping.

    By dBLiSS March 04, 2001, 04:47 PM

I'm not biased towards either company, but I gotta say: anyone saying "I told you so" about ATi being better because the prez of the company said its next chip will beat the GF3... well, of course he's gonna say that!! He's not gonna come out and say, "well, it will be almost as good". And another thing: the Radeon 2 might not be out for a couple of months, so they have plenty of time to see what the GF3 is like and then beat it. If NVIDIA waited for ATi to release something and then a couple of months later released something new to beat it, no one would be making a big hoopla. Anyway, I hope the Radeon 2 is faster and cheaper than the GF3, because then I will buy one, even if it is 6 months later. I can live with my GF2 Ultra until then. =)

    By Drakh March 04, 2001, 05:07 PM

quote:Originally posted by RadeonMAXX:
Broken Radeon chip? Please explain to me what's broken about it? [snip]

Well, John Carmack said in his last .plan update:
    "While the Radeon is a good effort in many ways, it has enough shortfalls
    that I still generally call the GeForce 2 ultra the best card you can buy
    right now, so Nvidia is basically dethroning their own product."

    By Shades March 04, 2001, 05:08 PM

    Fog doesn't work or won't work properly with the radeon.

    Is it the chip or the drivers? If it's the drivers then we are back to my original argument that Ati's drivers suck.

    Like many of you I too am waiting for the coveted unified drivers.

    Also I heard the fog problem was passed down from the rage128 drivers.

    By Bog_Trooper March 04, 2001, 05:40 PM

    John Carmack is the one who said there is something broken in the Radeon Chip....in the same .plan quoted above.

And no, Hyper-Z is not innovative ENOUGH...memory bandwidth reduction is not innovation, it's problem solving. Programmable pixel shaders and programmable GPUs, etc., etc...that's innovative. The bandwidth reduction would have come anyway...all companies faced that problem.

    By RadeonMAXX March 04, 2001, 06:14 PM

Shades, it's the drivers...the Rage128 core. It's a good thing ATi is building a Radeon-core driver set. Orton states they're making unified drivers...I have a feeling those drivers will ship with the Radeon 2. I mean, ATi doubled the staff on Radeon products, which includes driver engineers, since Orton took over. Well, what's broken in the Radeon chip? I haven't heard anything about it. Isn't the GeForce's DXT1 texture compression broken in hardware?

    By Rado March 04, 2001, 06:24 PM

quote:Originally posted by Bog_Trooper:
John Carmack is the one who said there is something broken in the Radeon Chip....in the same .plan quoted above. [snip]

John Carmack said that something "seems" to be broken.
And yes, ATi is clearly not very innovative, unlike the GF3, which *cough* introduced *cough* new technology like keyframe interpolation, vertex skinning, HyperZ, and programmable pixel shaders (yes, the Radeon has 'em).
So, who is the follower?
DX8 requires a programmable GPU, so it was not nVidia who thought of that.
Both companies are following Microsoft's model, and both have their own strengths and weaknesses.

btw, what is stopping ATi from making good drivers?

    By Humus March 04, 2001, 08:07 PM

quote:Originally posted by Bog_Trooper:
John Carmack is the one who said there is something broken in the Radeon Chip....in the same .plan quoted above. [snip]

The thing that is "broken" is the way it does filtering. Instead of the transitions between mipmap levels running in straight lines, they may walk in triangular patterns, sort of. Does it look worse in any way? No. It's just that the transition borders move differently than on other cards with bilinear filtering. Enabling trilinear removes the lines completely, and since trilinear is pretty cheap on the Radeon it should be enabled anyway.

    HyperZ not innovative? Solving problems isn't innovative if done with smart tech?
    And yeah, the pixel shaders on Radeon are innovative.
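Humus's point about trilinear hiding the mipmap transition can be sketched in a few lines of toy code (single-channel textures and illustrative helper names, nothing from any real driver): bilinear sampling reads one mip level, so the border where the integer LOD changes is visible, while trilinear blends the two nearest levels by the fractional LOD, fading the border away.

```python
def bilinear(mip, u, v):
    """Sample one mip level (a 2D grid of floats) with bilinear filtering."""
    h, w = len(mip), len(mip[0])
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = mip[y0][x0] * (1 - fx) + mip[y0][x1] * fx
    bot = mip[y1][x0] * (1 - fx) + mip[y1][x1] * fx
    return top * (1 - fy) + bot * fy

def trilinear(mips, u, v, lod):
    """Blend the two mip levels nearest to `lod`, smoothing the transition."""
    lo = min(int(lod), len(mips) - 1)
    hi = min(lo + 1, len(mips) - 1)
    f = lod - lo
    a = bilinear(mips[lo], u, v)
    b = bilinear(mips[hi], u, v)
    return a * (1 - f) + b * f
```

With bilinear only, the sampled value jumps wherever the chosen mip level changes; trilinear fades between levels instead, which is why the "walking" transition borders disappear.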

    By DRYNDRYN2 March 04, 2001, 08:21 PM

And key frame interpolation is innovative, and 3D textures, and many more innovations. And it has the best DVD playback.

    By AMD_Forever March 05, 2001, 02:29 AM

    quote:Originally posted by 3dcgi:

    They were first to market with a T&L.

    S3 with the savage 2000 had the first on-chip T&L.

whether or not it worked right, however, is another story.....

    By DRYNDRYN2 March 05, 2001, 03:17 AM

    quote:Originally posted by AMD_Forever:
    S3 with the savage 2000 had the first on-chip T&L.

    whether or not it worked right however.....

S3 first announced that their next card would have T&L - one day before nVidia.

But I think the GeForce was released before the Savage2000.

    By Drakh March 05, 2001, 05:20 AM

    Ehm...
It's not like the graphics card companies are 'inventing' new features or concepts; they are just competing to have them on-chip first.

Naturally a card that comes later will have more features; it's just a matter of technological evolution.

And many features already existed on professional boards before the gaming card companies had their try.

    By leoku March 05, 2001, 02:18 PM

quote:Originally posted by RadeonMAXX:
According to an interview at ebns.com with David E. Orton, ATi's president and chief operating officer, ATi's new GPU will beat GeForce3's specifications and will offer a programming capability as well. [snip]


Even if what he says is true, by the time ATI releases the Radeon 2 the GF3 GTS will already beat it up, followed by the heavy blow of a GF3 Ultra.

It looks like ATI is more advanced in bragging than in product releases and driver support. Has anyone noticed that Nvidia never bragged about beating the Radeon? They don't have to, since it's the truth and they know exactly where ATI's problem is.

It is easy for ATI to make a comeback. Just redo the Radeon driver; make it work reliably in Win9X, Me, 2K and Linux across the board. Make DirectX 7, 8 and OpenGL fly reliably and stably (I don't have enough words to use). If their driver is more stable than the Detonator, ATI holds ground.

I have been waiting ever since the Radeon officially saw the world. I am still waiting, and it is the ATI driver that holds me up. I am not asking too much, though. All I want is a fast and stable Windows 2000 driver that can run many 3D games. Can ATI fulfill this humble wish? We will see with the Radeon 2. If not, I will hold out for a GF3.


    By The Grinch March 05, 2001, 03:39 PM

I've got a Radeon and the drivers for it are fine. I download a lot of game demos off the internet and have always had great success. D3D or OpenGL, it's nice and smooth. If you own a GeForce or another brand, don't pass judgment on something you don't own. ATI has been releasing and leaking beta drivers like there's no tomorrow.

The Radeon II may or may not be faster than the GF3. Orton said that the Radeon II would beat nVidia's specs, not its speed. We'll just have to wait to see how the final products perform.

I don't believe that the statement was meant to deter people from running out to buy a GF3. Seriously, if someone can afford to spend $500-$550 on a toy, they're not going to wait. It's called compulsive shopping.

    By 256K March 05, 2001, 04:11 PM

I have a GeForce2 GTS and like the card a lot, although I also know the Radeon is a great piece of hardware. I believe Nvidia already knows that ATI, with the Radeon's low price, has hurt their market share. It's my opinion that the new GeForce3 has to come in at a lower price, or be discounted rapidly, to compete with the Radeon 2. Of course that presumes ATI keeps its lower price structure.

    By Cpt. Howdy March 05, 2001, 04:19 PM

    quote:Originally posted by James_Emerson:
    Do you think would Bill Clinton, would say i had sex with that woman? Hell no.. hehehe

Well, according to Bill's interpretation of sex, he was telling the truth. And maybe, in some weird and sick way that may or may not have to do with penetration, the president of ATi is also telling the truth.

    By Humus March 05, 2001, 06:55 PM

quote:Originally posted by Drakh:
It's not like the graphics card companies are 'inventing' new features or concepts, they are just competing to have them first on chip. [...] And many features already existed on professional boards before the game card companies had there try. [snip]

Many of the new features are indeed invented by ATi, nVidia, etc. For instance, no professional card AFAIK has implemented any of the shader stuff.

    By Bog_Trooper March 05, 2001, 08:44 PM

Ok, the stuff people claim the Radeon already has and had first, like vertex skinning, etc., etc.: is it possible it was not created by ATI, and they just made it possible to do in DX8?

My reason for saying this is that I have not heard that ATI licensed anything to Microsoft for DX8...only that 3dfx and NVIDIA had. That means a lot of the functions the Radeon supports is technology that NVIDIA/3dfx created...but was holding back for the GF3. Again...not innovation on ATI's part....just compatibility with others' technology.

Thoughts, reflections?

And to whoever said all graphics card companies have posted losses: wrong. NVIDIA is the only tech/hardware company I know of to make a profit last quarter, even beating the street estimate. All this including the buyout of 3dfx. That is impressive considering the state of PC and hardware sales.

    By 3dcgi March 05, 2001, 11:17 PM

    quote:Originally posted by Bog_Trooper:

    My reason for saying this is I have not heard that ATI licensed anything to microsoft for DX8...

    Actually ATI contributed N-Patch tessellation to DX8. I believe NVIDIA contributed the programmable shaders, but I'm not 100% sure.

    By BMGundam March 06, 2001, 12:04 AM

Hey, just so you know: DXT1 is STILL broken in hardware in the BRAND SPANKING NEW GF3.

    Here's a link for proof http://www.gamebasement.com/pages/home.asp?nav=articles&id=58

I know ATi has driver problems, but this is worse in my book. They still haven't been able to fix it in hardware, even after 3 releases of new hardware.

    By Humus March 06, 2001, 01:26 AM

It amazes me that they didn't fix it in the GF3. I mean, how hard can it be? Anyone who has taken a digital design class should be able to design an S3TC decompression unit.

    By coolqf March 06, 2001, 02:10 AM

When I read that a company says "the product will be released this year",
I think "hah! November at the earliest."
That's assuming no delays.
I do this for several games and applications too.
If the Radeon II comes in November, that's 8 months to work on the card.
I personally believe that ATI will try their best to get the product out for people to buy by October. Missing the Christmas season would hurt a lot.

    By rial9 March 06, 2001, 04:00 AM

    quote:Originally posted by Humus:
    It amazed me that they didn't fix it in the GF3, I mean, how hard can it be? Anyone who has read a digital design class should be able to design a S3TC decompressing unit.

S3TC is not "broken" on any of the GeForce cards. There are several different methods of using texture compression, and the one that Quake 3 uses can be done in 16-bit only, therefore resulting in a loss of quality even in 32-bit. I had an article on this but can't seem to find it now.

    By BuggyLoop March 06, 2001, 04:21 AM

Uh, rial, why doesn't my Radeon have that bug in Q3 then? nVidia admitted there is a bug in their hardware. Don't try to change people's minds, because if it wasn't so late I would slap you with a link, but I'm lazy.

    By Humus March 06, 2001, 07:15 AM

quote:Originally posted by rial9:
S3TC is not "broken" on the any of the Geforce cards. [snip]

And that is the exact problem: it interpolates using only 16 bits, while all other cards use at least 32 bits. Even though the data is 16-bit, there's no reason to use only 16 bits for the interpolation.
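The precision difference Humus describes is easy to demonstrate with a toy DXT1 palette computation (a sketch for illustration only, not NVIDIA's or ATi's actual hardware, and all function names here are made up). DXT1 stores two RGB565 endpoint colors per 4x4 block, and two of the four palette entries are 1/3-2/3 blends of those endpoints; blending after expanding the endpoints to 8 bits per channel gives different values than blending the raw 5/6/5 fields and expanding afterwards:

```python
def expand565(c):
    """Expand a packed RGB565 value to an (r, g, b) tuple of 8-bit channels."""
    r, g, b = (c >> 11) & 0x1F, (c >> 5) & 0x3F, c & 0x1F
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

def third_8bit(c0, c1):
    """The 2/3*c0 + 1/3*c1 palette entry, interpolated at 8 bits per channel."""
    a, b = expand565(c0), expand565(c1)
    return tuple((2 * x + y) // 3 for x, y in zip(a, b))

def third_16bit(c0, c1):
    """Same entry, but interpolating the raw 5/6/5 fields first (lossy)."""
    f0 = ((c0 >> 11) & 0x1F, (c0 >> 5) & 0x3F, c0 & 0x1F)
    f1 = ((c1 >> 11) & 0x1F, (c1 >> 5) & 0x3F, c1 & 0x1F)
    r, g, b = [(2 * x + y) // 3 for x, y in zip(f0, f1)]
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))
```

For white and black endpoints (0xFFFF and 0x0000), the 8-bit path gives a neutral gray (170, 170, 170), while the 5/6/5 path lands a few steps off at (165, 170, 165); across many blocks those small per-block errors show up as the banding people were seeing.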

    By Rogue March 06, 2001, 09:02 AM

I can't wait to see the coming battle.
Should be interesting...

    By Racer^ March 06, 2001, 10:37 AM

    It looks to me like the specs posted by Nvidia about the GF3 are lower than they will actually be. I believe this because in the past, many companies have released outrageous specs which everyone raved about until the product's release - when everyone found out that the product had to be slowed down in order to make it stable, etc.
    NVidia does NOT want this.

The GF3 is set to run 50 MHz slower than the GF2 Ultra despite using a smaller production process (.15µ as opposed to .18µ). This spells what I believe will be a big surprise when GF3 boards go retail. Sure, there will probably still be low-end models running at 200-250 MHz, but the high-end ones should quickly exceed 300 MHz.

    Besides, even if they aren't released any faster, it should be possible to OC them.

    By The Grinch March 06, 2001, 10:40 AM

I was just reading on Anandtech that the DXT1 texture compression bug still exists with the new GeForce3; however, the Detonator 6.50 drivers have a work-around... a band-aid solution.

nVidia should have fixed this in hardware, but they didn't. This means they were rushing to get their product out first. Guess what happens when companies rush products out... bugs! I wonder what else will creep up over the next month or two...

    ...and for those that don't understand or believe in the DXT1 bug/issue, visit this link for an in-depth explanation. http://www.gamebasement.com/pages/home.asp?nav=articles&id=58

    By The Grinch March 06, 2001, 10:49 AM

quote:Originally posted by rial9:
S3TC is not "broken" on the any of the Geforce cards. [snip]

It was nVidia that claimed there was no bug and that it was working as they designed it to. That's only half right. They were correct in saying that it worked as they designed it, but it's still a bug whether nVidia wants to admit it or not. If you don't agree that it's a bug, you should at least agree that the design nVidia chose to implement was a bad one.

I suppose it's up to one's interpretation of the word bug.

    By dBLiSS March 06, 2001, 11:06 AM

quote:Originally posted by Rado:
DX8 requires a programmable gpu, so it was not nvidia who thought of that. [snip]

Microsoft and Nvidia worked very closely while developing DX8. Do you honestly think Microsoft just came up with programmable shaders on its own and told Nvidia to do it? NOPE...Nvidia developed it, and Microsoft, working with Nvidia (especially since they are Xbox partners), put the shaders into DX8. Microsoft ultimately decides what goes into DX releases, but chipset manufacturers have a large input into what goes in. Microsoft doesn't just make stuff up and force everyone to comply with the DirectX standard.

    By SlartyB March 06, 2001, 11:43 AM

    quote:Originally posted by Racer^:
    I believe this because in the past, many companies have released outrageous specs which everyone raved about until the product's release - when everyone found out that the product had to be slowed down in order to make it stable, etc.
    NVidia does NOT want this.

How old are you? No, I am not trying to insult you or be rude. It's just that you obviously don't remember the "nVidia-of-old". There was a time when nVidia would promise the world and then deliver products that were *WAY* below their claimed performance. It is still happening to a certain extent. Companies like 3dfx and ATI, on the other hand, were completely tight-lipped about performance and clock speeds until the card was practically on store shelves.

    By sc5mu93 March 06, 2001, 12:14 PM

Just wondering, didn't 3Dlabs offload the geometry onto the card a while back? Granted, they aren't a mainstream card manufacturer, but I was wondering if Nvidia wasn't actually the first to do it.

    By Racer^ March 06, 2001, 01:21 PM

    quote:Originally posted by SlartyB:
    There was a time when nVidia would promise the world then deliver products that were *WAY* below their claimed performance.

That's what I was saying! You claim I don't remember when they delivered products below their claimed performance - but I said that because of those past experiences, NVidia decided to make sure this would not happen again. I believe they decided to specify performance levels at or below what they KNOW they can attain. It is because of this that I concluded there would be more room for improvement.

    By Heffe March 06, 2001, 05:30 PM

This topic is pretty long, ain't it?

    By Subnova March 06, 2001, 05:33 PM

What about those unified drivers I heard about a while ago?

    ~Share the knowledge~

    By heliosc March 06, 2001, 06:15 PM


Oh, great inovation. They wait to see what the competition has, and then steals it. [snip]

    >>>>>>>>
Given the fact that NV's cards are out 3 months before everyone else's, how can THEY be doing the stealing? NVIDIA is the single most innovative hardware company out there, no doubt about it. They got to the top purely by making great cards with awesome drivers at good prices. While ATI is getting better, it'll be a while before I trust their rep as much as I do NVIDIA's.

    By OOAgentFiruz March 06, 2001, 07:25 PM

"Given the fact that NV's cards are out 3 months before everyone else's, how can THEY be doing the stealing? NVIDIA is the single most innovative hardware company out there, no doubt about it. [snip]"

My vote for innovation goes to PowerVR's Kyro.

    By sc5mu93 March 06, 2001, 09:24 PM

    quote:Originally posted by OOAgentFiruz:
My vote for innovation goes to PowerVR's Kyro.

<sarcasm> No, the Kyro can't be innovative, because it addresses the problem of memory bandwidth limitation </sarcasm>

    By Elxman March 06, 2001, 09:50 PM

quote:Originally posted by heliosc:
Given the fact that NV's cards are out 3 months before everyone else's, how can THEY be doing the stealing? NVIDIA is the single most innovative hardware company out there, no doubt about it. [snip]

Look at the Radeon chip and the GF3 chip and see how many differences you can find.
Look at the Radeon chip and the GF2 chip and see how many differences you can find.
btw, I do not think $550, give or take $20, for a vid card is a "good price".


    By The Grinch March 06, 2001, 10:12 PM

    quote:Originally posted by heliosc:
    NVIDIA is the single most innovative hardware company out there, no doubt about it.

You guys crack me up. How soon we forget that 3dfx was suing nVidia up the ying-yang? Two words: patent infringement. If 3dfx had managed to survive, nVidia would be forking out royalties for stealing their ideas and we wouldn't be having this discussion.

So there. I win.

    By 3dcgi March 07, 2001, 02:39 AM

    quote:Originally posted by sc5mu93:
    just wondering, didnt 3dlabs offload the geometry onto the card a while back? given this isnt a mainstream card manufacturer, but i was wondering if Nvidia wasnt the first to do it.

    Yes, 3dlabs and some others used geometry acceleration, but they used a second chip. Nvidia was the first to combine everything into a single chip.

    By 3dcgi March 07, 2001, 02:46 AM

    quote:Originally posted by heliosc:

    Oh, great innovation. They wait to see what the competition has, and then steal it. That's very innovative. So what's wrong then with ATI doing the same thing? If Nvidia can, why not ATI?

    >>>>>>>>
    Given the fact that NV's cards are out 3 months before everyone else's, how can THEY be doing the stealing? NVIDIA is the single most innovative hardware company out there, no doubt about it. They got to the top purely by making great cards with awesome drivers at good prices. While ATI is getting better, it'll be a while before I trust their rep as much as I do NVIDIA's.

    Innovation aside, the best thing about nVidia is execution. It's amazing that those guys put out chip after chip with barely any slippage. The GeForce3 shipped a little late, but still came out before anyone else had a programmable chip.

    By Humus March 07, 2001, 09:10 AM

    quote:Originally posted by 3dcgi:
    Innovation aside, the best thing about nVidia is execution. It's amazing that those guys put out chip after chip with barely any slippage. The GeForce3 shipped a little late, but still came out before anyone else had a programmable chip.

    Radeon is programmable. It has programmable pixel shaders.

    By Humus March 07, 2001, 09:12 AM

    quote:Originally posted by heliosc:
    Given the fact that NV's cards are out 3 months before everyone else's, how can THEY be doing the stealing?

    What kind of fact is that? It's all about how you see it. The GF2 was out perhaps 3 months before Radeon; on the other hand, the Radeon was out 5-6 months before GF3.

    By Drakh March 07, 2001, 10:37 AM

    quote:Originally posted by The Grinch:
    You guys crack me up. How soon we forget that 3dfx was suing nVidia up the yin-yang? Two words: Patent Infringement. If 3dfx had managed to survive, nVidia would be forking out royalties for stealing their ideas and we wouldn't be having this discussion.

    So there. I win.

    What's your point?

    Do you people expect 3d-card makers to invent new ways of rendering 3d graphics every time they release a new card?

    Sometimes I think that patent laws need a rehash, preventing companies from patenting common things or things that they clearly have not invented.

    I read that Apple has been granted a patent on 'changing the appearance of the operating system on computers', i.e. skinning the OS. They hardly invented this feature; instead they patented it to prevent other companies (Microsoft) from implementing it in their OSes.

    Same story with RamBust.

    In the end it hurts the end user.

    By Sol March 07, 2001, 10:39 AM

    Wow, video card arguments are great. Who cares who invented what? This is business and yes, if one company does something, the other will try to outdo it. I remember when Nvidia was the underdog going up against 3dfx; now it's the opposite. Competition is good, so who cares who invented what? We get better cards in the end. I am sure NVidia took some stuff from ATI, and the opposite. Oh well.. it's early and I am tired.

    By andycap March 07, 2001, 12:18 PM

    As far as the 3dfx vs Nvidia patent stuff:

    Most of the case 3dfx presented was thrown out; only a few complaints were deemed valid for trial by a judge, though this did take years.

    And what was left was related to Voodoo 1 / Riva 128 stuff.. very unlikely it would have changed anything, as much as some people have hyped it.

    If 3dfx would have won (very unlikely if you were reading the follow-ups and such), it wouldn't have just affected nVidia; ATI, Matrox, S3, Intel and everyone and his brother who made vid cards would have been screwed. But most of it was tossed out of court as a frivolous lawsuit, and the court tried to get the two companies to settle several times because of the broad and outdated patent laws. Companies can throw legal power around to slow down the competition.

    The US patent laws are awful, and for years all these companies have been suing each other. Remember nVidia had a separate case against 3dfx as well. In fact 3dfx filed cases and suits against quite a few companies; I believe most were dismissed outright? Wasn't there a previous case 3dfx had against nVidia that got tanked as well?

    ya I know spelling errors but i'm lazy

    By The Grinch March 07, 2001, 01:32 PM

    You guys don't get it. 3dfx tried to sue nVidia; nVidia was trying to sue 3dfx. Both companies claimed the other was stealing ideas. How can we make claims as to who the real innovators are if the manufacturers can't even figure it out? This is a battle that neither side will win. But I think we can all agree that 3dfx, nVidia, ATI, Matrox, etc., all contributed something to the full experience we have today.

    Seriously people, this isn't a football game. You guys are way too emotional over this topic.

    GO HAVE A BEER AND RELAX DAMNIT!

    By [DWC]DarkWolf March 07, 2001, 03:03 PM

    You see me? YOU SEE ME!? NO! You don't (I hope), but I assure you I'm foaming at the mouth (ok, I'm lying, I'm not...). But I am holding out for the Radeon II. I have a Voodoo3 3500 and it's still right quick on my PIII 800EB with 256MB RAM. When EB gets some Radeons in, I'll try one out. 'Till then, yummy in my tummy because I'm the only person I know who thinks power bars taste good. (yes I just ate one)

    By SlartyB March 07, 2001, 03:16 PM

    quote:Originally posted by Racer^:
    That's what I was saying! You claim I don't remember when they delivered products below their claimed performance - but I said that because of these past experiences, NVidia decided to make sure this would not happen again. I believe they decided to specify performance levels at or below levels they KNOW they can attain. It is because of this that I concluded there would be more room for improvement.

    Good!! Then we agree. I am sorry if I sounded accusatory - it was not my intention. It's just that your post was phrased in such a way that it gave the impression that nVidia strive to take the "high ground" whereas others do not. I was merely trying to point out that nVidia have been pretty "liberal" with their performance claims in the past.

    By SlartyB March 07, 2001, 03:21 PM

    quote:Originally posted by Sol:
    ....Competition is good so who cares who invented what....

    The shareholders and employees of the companies care - and without them, there would be no graphics cards.

    By SlartyB March 07, 2001, 03:33 PM

    quote:Originally posted by andycap:
    As far as the 3dfx vs Nvidia patent stuff:

    Most of the case 3dfx presented was thrown out; only a few complaints were deemed valid for trial by a judge, though this did take years.

    Actually NO. That is a lie. All but one of 3dfx's claims were upheld in the Markman hearing, and every single one of nVidia's counter-claims was dismissed. 3dfx were actually pressing for a summary judgement, but in the end it was all too late.

    quote:
    And what was left was related to Voodoo 1 / Riva 128 stuff.. very unlikely it would have changed anything, as much as some people have hyped it.

    Wrong again. The whole multi-texture patent infringement was centred around the TNT and TNT2 because of the multi-texturing. Do you know how many TNT2s nVidia has sold?!?! If 3dfx had a royalty for every one nVidia sold - it would have changed everything.

    quote:If 3dfx would have won (very unlikely if you were reading the follow-ups and such), it wouldn't have just affected nVidia; ATI, Matrox, S3, Intel and everyone and his brother who made vid cards would have been screwed. But most of it was tossed out of court as a frivolous lawsuit, and the court tried to get the two companies to settle several times because of the broad and outdated patent laws. Companies can throw legal power around to slow down the competition.

    The US patent laws are awful, and for years all these companies have been suing each other. Remember nVidia had a separate case against 3dfx as well. In fact 3dfx filed cases and suits against quite a few companies; I believe most were dismissed outright? Wasn't there a previous case 3dfx had against nVidia that got tanked as well?

    ya I know spelling errors but i'm lazy

    Your comments about the general malaise in the Patent system are very true, but your other comments are speculation or just plain wrong.

    By Sol March 07, 2001, 04:18 PM

    quote:Originally posted by SlartyB:
    The shareholders and employees of the companies care - and without them, there would be no graphics cards.


    You are correct on that, but I don't care who invented what. I was mainly referring to myself caring. I will buy the better product for the cheaper price in the end, and most consumers will also. Brand loyalty... hmmm, not the case for me. You will have some people who will live and die by a company, and more power to them.

    By SlartyB March 07, 2001, 04:42 PM

    quote:Originally posted by Sol:

    You are correct on that, but I don't care who invented what. I was mainly referring to myself caring. I will buy the better product for the cheaper price in the end, and most consumers will also. Brand loyalty... hmmm, not the case for me. You will have some people who will live and die by a company, and more power to them.

    Spoken like a true capitalistic consumer

    Whilst we are both correct in our statements, I would put it to you that your choice and options as a consumer are greatly diminished when one company causes the demise of another through the wrongful use of their technology. That - and the fact that people's livelihoods are at stake - is why there are laws against it.

    In a broader sense, it is short-sighted of consumers to think that they can have what they want without consequence. The consequences may not be immediately apparent, but that makes them no less real.

    By lostboy March 07, 2001, 04:50 PM

    Oiya... I saw this post from SlartyB coming:

    quote:Originally posted by SlartyB:
    Your comments about the general malaise in the Patent system are very true, but your other comments are speculation or just plain wrong.

    *Pats SlartyB on the back* ^_^ Dang, I have to say that your dedication and loyalty are incredible. I'm sorry (both from the standpoint of a consumer, in that I'd have loved to see the Rampage, and as a sympathizer) that the 3Dfx deal ended as it did. I don't know what you are doing for a living, but with such dedication as you displayed in your extremely vigilant defense of 3Dfx, I expect they were lucky to get a hold of you. I wish you better luck in future pursuits.

    -lostboy

    By The Grinch March 07, 2001, 05:08 PM

    quote:Originally posted by SlartyB:
    .....Your comments about the general malaise in the Patent system are very true, but your other comments are speculation or just plain wrong.

    Hehe. Slarty to the rescue.

    By the way, will you still eat your shorts and video tape it as proof if ATI beats out nVidia by the end of the year?

    By richardginn March 07, 2001, 05:34 PM

    When ATI says what the specs are, Nvidia might be in for a big surprise.

    By Triton March 07, 2001, 05:38 PM

    I just wish they'd both release the cards; then we'd have facts and an end to the discussion. Simple as that.

    By Drakh March 07, 2001, 06:14 PM

    quote:Originally posted by richardginn:
    When ATI says what the specs are, Nvidia might be in for a big surprise.

    I hope so. In the current situation things don't look that good for ATI, as they have no competitor to the GF3 yet - and if speculations are right, not until the third quarter.

    Guess which company is issuing a profit warning.

    By 3dcgi March 07, 2001, 07:22 PM

    quote:Originally posted by Humus:
    Radeon is programmable. It has programmable pixel shaders.

    True. So did the GeForce2. I was actually thinking of the GeForce3 being the first chip to implement the full DX8 programmable pipeline. I just wasn't specific enough.
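
    To make the "programmable" distinction in this exchange concrete: a fixed-function combiner lets the application pick from a preset menu of combine operations, while a programmable shader stage accepts arbitrary per-pixel code. Here is a toy sketch (the mode names and operations are invented for illustration, not any actual chip's combiner set):

```python
# Toy contrast between a fixed-function combiner and a programmable one.
# Pixel values are plain floats in [0, 1].

FIXED_MODES = {
    "modulate": lambda a, b: a * b,          # multiply the two inputs
    "add": lambda a, b: min(a + b, 1.0),     # add with saturation
}

def fixed_function(mode, a, b):
    # Fixed-function: only a preset menu of combine ops is available.
    return FIXED_MODES[mode](a, b)

def programmable(shader, a, b):
    # Programmable: the application supplies its own per-pixel program.
    return shader(a, b)

print(fixed_function("modulate", 0.5, 0.5))               # 0.25
print(programmable(lambda a, b: (a + b) ** 2, 0.5, 0.5))  # 1.0
```

    Real DX8-era pixel shaders were of course far more constrained than an arbitrary function, but the shift from "choose a mode" to "supply a program" is the distinction being argued.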

    By SlartyB March 07, 2001, 10:24 PM

    quote:Originally posted by The Grinch:
    Hehe. Slarty to the rescue.

    By the way, will you still eat your shorts and video tape it as proof if ATI beats out nVidia by the end of the year?

    DAMN!!! You have a long memory ...

    Ummmm, errrr, well that remains to be seen. Personally, I would *LOVE* to see ATI kick nVidia's ass. However, given their recent announcements about earnings and the fact that there is scant information on the web, I feel fairly comfortable I will have some shorts to wear for the summer

    ( Note to self : Be careful what you say - it may come back and bite you in the shorts )

    By SlartyB March 07, 2001, 10:37 PM

    quote:Originally posted by lostboy:
    Oiya... I saw this post from SlartyB coming:

    *Pats SlartyB on the back* ^_^ Dang, I have to say that your dedication and loyalty are incredible. I'm sorry (both from the standpoint of a consumer, in that I'd have loved to see the Rampage, and as a sympathizer) that the 3Dfx deal ended as it did. I don't know what you are doing for a living, but with such dedication as you displayed in your extremely vigilant defense of 3Dfx, I expect they were lucky to get a hold of you. I wish you better luck in future pursuits.

    -lostboy

    Thank you very much. It was an honour to work for 3dfx. Not only did I have the opportunity to work with a bunch of very smart, dedicated people and learn a lot, but it was fun too!

    Luckily, I have found another job in 3D graphics - but I can't tell you what it is yet, because I don't want to spoil the surprise

    Let's just say it's at the other end of the "spectrum" compared to 3D on a PC

    Oh - and one of the best parts is that there are a whole BUNCH of x3dfx engineers there (and NO, it's NOT nVidia).

    By Sol March 08, 2001, 08:01 AM

    quote:Originally posted by SlartyB:
    Spoken like a true capitalistic consumer

    Whilst we are both correct in our statements, I would put it to you that your choice and options as a consumer are greatly diminished when one company causes the demise of another through the wrongful use of their technology. That - and the fact that people's livelihoods are at stake - is why there are laws against it.

    In a broader sense, it is short-sighted of consumers to think that they can have what they want without consequence. The consequences may not be immediately apparent, but that makes them no less real.

    Well, I can go to the store, ask the person buying one of these cards "do you really want to buy that? They stole that from company B" - and most of them won't care. Key factors are performance and price. I personally wish there were three or four really good cards out. Having just NVIDIA and ATI doesn't float my boat right now.

    Also, I am not your typical consumer. I had a Voodoo2 up until October. I am a video card maker's worst nightmare because I won't upgrade for minor performance. I have a GeForce2 right now; at my rate, I will get the GeForce4 or 5, or the Radeon 4 or 5.

    By SlartyB March 08, 2001, 01:34 PM

    quote:Originally posted by Sol:
    Well, I can go to the store, ask the person buying one of these cards "do you really want to buy that? They stole that from company B" - and most of them won't care. Key factors are performance and price. I personally wish there were three or four really good cards out. Having just NVIDIA and ATI doesn't float my boat right now.

    Also, I am not your typical consumer. I had a Voodoo2 up until October. I am a video card maker's worst nightmare because I won't upgrade for minor performance. I have a GeForce2 right now; at my rate, I will get the GeForce4 or 5, or the Radeon 4 or 5.

    I agree entirely. I too upgrade my hardware once every couple of years or so (except video cards - which I got free until recently).

    This isn't the place to have a deep philosophical debate over the vagaries of capitalism. I just think it is a sad reflection on society as a whole that, when you get right down to the bottom line, people don't really care about anything except how good something is and how much it costs. I'm not just talking about video cards now - I mean *everything*. If people could, they would still buy ivory, even though it would probably mean the end of elephants; they currently buy tropical hardwoods (myself included) even though it results in the destruction of rainforest. I know video cards are not quite in the same league - but it is all the same to varying degrees. It's about time people woke up, smelled the coffee and realised that every action they take has a consequence, and some of those consequences are going to make life bloody awful for some people in the future. Yes, yes - I know "capitalism has no conscience"; well, maybe it's about time it grew up and got one.

    There, now I have that off my chest - we can get back to arguing about which is better - ATI or nVidia

    By Sol March 08, 2001, 03:02 PM

    Heh, I had a Celeron 300A overclocked to 450 with a Voodoo2 as my main setup up until October. This time I might keep things up to date though, since I am making some good $ now.

    By lostboy March 08, 2001, 04:45 PM

    quote:Originally posted by SlartyB:
    Thank you very much. It was an honour to work for 3dfx. Not only did I have the opertunity to work with a bunch of very smart, dedicated people and learn a lot, but it was fun too!

    No problem, I'm glad you enjoyed your job.

    quote:Luckily, I have found another job in 3D graphics - but I can't tell you what it is yet, because I don't want to spoil the surprise

    Woot! SlartyB has gone to work for Bitboys and just won't admit it. ^_- So when are you going to release your great-bandwidth GeForce3 killer?

    quote:Let's just say it's at the other end of the "spectrum" compared to 3D on a PC

    Nah, more like just a "bit" different from your old line of work.

    quote:Oh - and one of the best parts is that there are a whole BUNCH of x3dfx engineers there (and NO, it's NOT nVidia).

    So that is where the rest of you guys ended up! Come on and admit it and give us the breakdown on the bitboys revolutionary card that is supposed to be out this year.

    Teasing, teasing. Good luck with the new job.

    -Lostboy

    By Un4given March 08, 2001, 05:29 PM

    SlartyB,

    I know there are a lot of things you can't talk about from your time with 3dfx, but you should be able to talk about the nature of the multitexturing part of the suit.

    What exactly was the nature of the suit? I know it can't be just multitexturing, since 3dfx didn't invent that. From what I understood it had to do with single-pass multitexturing, right? If that is right, did NV directly copy 3dfx's method of doing this, or did they actually have a different method, with 3dfx claiming their patents covered any method of applying multiple textures in a single pass?

    Not slamming, just curious.

    By Humus March 08, 2001, 06:44 PM

    quote:Originally posted by Un4given:
    What exactly was the nature of the suit? I know it can't be just multitexturing, since 3dfx didn't invent that.

    3dfx did indeed invent multitexturing.

    By SlartyB March 08, 2001, 08:13 PM

    quote:Originally posted by Un4given:
    SlartyB,

    I know there are a lot of things you can't talk about from your time with 3dfx, but you should be able to talk about the nature of the multitexturing part of the suit.

    What exactly was the nature of the suit? I know it can't be just multitexturing, since 3dfx didn't invent that. From what I understood it had to do with single-pass multitexturing, right? If that is right, did NV directly copy 3dfx's method of doing this, or did they actually have a different method, with 3dfx claiming their patents covered any method of applying multiple textures in a single pass?

    Not slamming, just curious.

    As Humus just said - yes, they did invent multitexturing.

    The patent covered applying multiple textures to a pixel in a single pass - that is, without the round-trip to the framestore that was typical before this technique came along.

    The only difference between the two implementations was that the texture units in the TNT could be separated into two separate single-texture units, thus doubling pixel throughput for pixels that were only single-textured. When combined serially to form a single unit that applied two textures to a pixel simultaneously, it was identical to the method outlined in 3dfx's patent. They (nVidia) didn't have a leg to stand on.

    It's not so much that nVidia "copied" 3dfx - more that they tried to use a technique that had already been patented (and was in use) by 3dfx. In this situation, you can either stop using the method, or pay the inventor (3dfx) a royalty for using the technique. nVidia opted to do neither - which is why 3dfx took them to court.
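
    The single-pass technique described above can be sketched in toy form: multi-pass rendering writes one texture, then reads the framebuffer back to blend in the second, while single-pass multitexturing combines both textures before a single framebuffer write. This is purely illustrative (a per-pixel modulate combine is assumed; real hardware operates on filtered texel colors):

```python
# Toy model: a "texture" is a list of per-pixel intensities and the
# framebuffer is a plain list; modulate (multiply) is the combine op.

def multipass(framebuffer, tex_a, tex_b):
    """Two passes, with a framebuffer round-trip between them."""
    for i in range(len(framebuffer)):
        framebuffer[i] = tex_a[i]                   # pass 1: write texture A
    for i in range(len(framebuffer)):
        framebuffer[i] = framebuffer[i] * tex_b[i]  # pass 2: read back, modulate with B
    return framebuffer

def singlepass(framebuffer, tex_a, tex_b):
    """One pass: both textures are combined before the single write."""
    for i in range(len(framebuffer)):
        framebuffer[i] = tex_a[i] * tex_b[i]        # no framebuffer round-trip
    return framebuffer

a, b = [0.5, 1.0, 0.25], [1.0, 0.5, 1.0]
print(multipass([0.0] * 3, a, b) == singlepass([0.0] * 3, a, b))  # True
```

    Same final image either way; the single-pass version simply avoids the extra framebuffer read and write per pixel, which is the saving the patent was about.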

    People don't seem to realise that to "invent" a technique can often take many man-years of work and a substantial financial outlay on the part of the inventor. Why should someone who has not been through that "pain" be allowed to reap the benefits of that invention without paying the inventor for the privilege? That is the essence of why we even have patents.


    Contact Us | www.SharkyForums.com

    Copyright (c) 1999, 2000 internet.com Corporation. All Rights Reserved.


    Ultimate Bulletin Board 5.46
