  SharkyForums.Com - Print: New Beta V5 Drivers - HSR? Maybe? Look like it?? Yea :-)

    New Beta V5 Drivers - HSR? Maybe? Look like it?? Yea :-)
    By jbirney November 29, 2000, 07:11 AM

    I noticed that 3dfx slipped us some new beta drivers that are supposed to give those lucky SOB Tribes2 beta testers some performance improvements. Lookie what I found in the readme under the Glide section:


    "Glide has received continuous updates in line with the latest released titles. See section 13.11 for specific application issues.
    Additional features:
    AA Toggle Key and Vertical Sync a new Hidden Surface Removal option see 3dfx tools for details.
    "


    This was in the Glide section only... but it shows promise. I have yet to try them... getting late, must sleep.....

    Get em here: http://www.3dfxgamers.com/view.asp?IOID=2494


    Wonder if any other games, cough UT cough, will see any improvements in the GLIDE HSR if it indeed works as we hope it might...hmmmm


    UPDATE:

    We have already seen reports rolling in on the 3dfx message board that the HSR is working and can give you up to a 25 FPS increase in fill-rate-limited conditions:
    http://www.3dfxgamers.com/boards.asp?BOID=5&THID=96022&boardsort=lastpost&boardpage=1


    Wowsers!

    By jbirney November 29, 2000, 07:15 AM

    Here are some numbers from Sharkfood:


    Just off the top. Q3, 1024x768x32, all settings maxed. demo001
    HSR=0, 76.7 fps
    HSR=1, 91.2 fps
    HSR=2, 104.6 fps
    HSR=3, 108.7 fps
    HSR=4, 109.1 fps


    Please note that there were some artifacts with the HSR=4 setting. I am still downloading them... damn nice gain :-)
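    Sharkfood's list works out to gains of roughly 19% to 42% over the HSR=0 baseline. A quick sketch of the arithmetic, with the figures copied from the post above:

```python
# Percentage gains implied by Sharkfood's Quake 3 demo001 numbers
# (1024x768x32, all settings maxed), copied from the post above.
fps = {0: 76.7, 1: 91.2, 2: 104.6, 3: 108.7, 4: 109.1}

baseline = fps[0]
for level, rate in sorted(fps.items()):
    gain = (rate - baseline) / baseline * 100
    print(f"HSR={level}: {rate:5.1f} fps ({gain:+5.1f}% vs HSR=0)")
```

    Note the diminishing returns past HSR=2: the top three settings sit within 9 fps of each other.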

    By Doward November 29, 2000, 07:36 AM

    I like it... another reason to get a V5 over a GTS for me!

    By DesertCat November 29, 2000, 10:27 AM

    quote:Originally posted by jbirney:
    here are some numbers from Sharkfood:


    Just off the top. Q3, 1024x768x32, all settings maxed. demo001
    HSR=0, 76.7 fps
    HSR=1, 91.2 fps
    HSR=2, 104.6 fps
    HSR=3, 108.7 fps
    HSR=4, 109.1 fps


    Please note that there were some artifacts with the HSR=4 setting. I am still down loading them...dam nice gain :-)


    This is great news! Now for all of those board reviews using Q3 as the holy grail of benchmarks, I demand a recount!

    By DesertCat November 29, 2000, 10:31 AM

    Now the big questions become: 1) When will HSR be incorporated into the D3D drivers? 2) Does this HSR implementation use SSE and/or 3DNow!? 3) How does this scale across different CPUs? 4) When will 3dfx finalize these puppies and make them the official drivers?

    By jbirney November 29, 2000, 10:52 AM

    quote:Originally posted by DesertCat:
    Now the big questions become:

    1) When will HSR be incorporated into the D3D drivers?

    I do not know. At the highest HSR setting there was some tearing of the textures reported, so they are still working on them. I'd imagine it will make it over to D3D, but your guess is as good as mine.


    quote:
    2) Does this HSR implementation use SSE and/or 3DNow!?

    No clue. I have seen posts of roughly equal gains at the same CPU speed for Athlon and P3 users.

    quote:
    3) How does this scale across different CPUs,

    Here is where I think we will see some bigger gains for the faster CPUs. One AMD owner had a 1 GHz CPU and got almost a 2x FPS increase. Now I have no idea if those are legit numbers or not. But if I understand the way they are using HSR correctly, then a faster CPU may offer better FPS gains. We know that in fill-rate-limited conditions, the CPU has a lot of idle time.

    Speculation mode = 1;

    Here is where the HSR drivers are working their magic: they use these extra CPU cycles to help figure out, at the driver level, what is hidden and what is not. With a faster CPU you have more idle time, or more time to figure out what should and should not be drawn, before dumping the result to the V5 to draw. Of course I have not seen the source code of the drivers; I am just making a simple generalization that MAY explain why we get higher gains on faster CPUs.

    Speculation mode = 0;


    quote:
    4) When will 3dfx finalize these puppies and make them the official drivers?

    Hopefully they will get these into D3D and see if there is anything that can be done about the tearing. There are 4 levels of HSR, with the two highest having some tearing in Q3 on certain levels. There is only a 9 fps difference between the top 3 settings, so you can run the third-fastest HSR setting, avoid the tearing, and get FPS only 9 fps slower than the fastest. I did not see any tearing in UT. But these drivers have been public for less than 24 hours. Let's wait and see as they are tested more and more :-)
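    jbirney's speculation above, idle CPU cycles spent culling hidden geometry before it reaches the card, can be illustrated with a toy sketch. Everything below (the coarse depth grid, the bounding-box test, the data layout) is invented for illustration; 3dfx never published how their driver actually worked:

```python
# Toy illustration of driver-side hidden surface removal: before
# submitting work to the card, the CPU tests each triangle's screen-space
# bounding box against a coarse depth grid and drops fully occluded ones.
# This optimistic bbox test is NOT how the 3dfx driver worked; a real
# implementation would need a conservative coverage test.

GRID = 8  # coarse occlusion grid, GRID x GRID cells over the screen

def cull_hidden(triangles):
    """triangles: list of (bbox, depth), bbox = (x0, y0, x1, y1) in
    [0, GRID) cell coordinates, smaller depth = nearer the camera.
    Returns the triangles worth sending to the card, front to back."""
    zgrid = [[float("inf")] * GRID for _ in range(GRID)]
    visible = []
    for bbox, depth in sorted(triangles, key=lambda t: t[1]):
        x0, y0, x1, y1 = bbox
        cells = [(x, y) for y in range(y0, y1 + 1) for x in range(x0, x1 + 1)]
        if all(zgrid[y][x] < depth for x, y in cells):
            continue  # every covered cell already holds nearer geometry
        for x, y in cells:
            zgrid[y][x] = min(zgrid[y][x], depth)
        visible.append((bbox, depth))
    return visible

# A near wall hiding a far quad, plus a visible patch elsewhere.
scene = [((0, 0, 3, 3), 1.0), ((1, 1, 2, 2), 5.0), ((4, 4, 7, 7), 2.0)]
print(len(cull_hidden(scene)))  # 2 -- the quad behind the wall is dropped
```

    The point of the sketch is only that the test runs entirely on the CPU, which is why spare CPU cycles (and a faster CPU) could translate into fewer pixels for the card to fill.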

    By DesertCat November 29, 2000, 11:33 AM

    Good comments Jbirney. I imagine that we will not know the full answers for a while on these, especially since they are beta. When they do become official, I expect 3dfx will release more of this kind of info.

    Hopefully the 3dfx guys will talk this stuff up with the technically oriented review sites when they get these drivers polished. I know I'm curious.

    By Doward November 29, 2000, 01:12 PM

    Oh, yes, the V5 shall gain some ground here! How's it comparing to the GTS now? I think 3dfx has some more tricks up its sleeve, and will perfect these drivers. Great work, 3dfx!

    By jbirney November 29, 2000, 01:23 PM

    Well, over at Riva Station they have some preliminary benchmarks...
    How does 97.1 fps in Q3 at 1600x1200x32 max settings sound to you folks?? Ultra = 57.1. Keep in mind that the tearing makes it unplayable for that benchmark, so the Ultra wins that one... but still, it shows that the theory works and has potential. I hear capping your FPS at some higher level will remove most of the tearing and cost almost zero FPS..... Not too shabby at all....

    http://www.rivastation.com/index_e.htm

    By Un4given November 29, 2000, 07:37 PM

    quote:Originally posted by jbirney:
    Well over at Riva Station they have some prelimary bench marks...
    how dose 97.1 fps in q3 16x12x32 max settings sound you folks?? Ultra = 57.1 keep in mind that the tearing makes it unplayable for that bench mark, so the ultra wins that one....but still it shows that the theory works and has potential. I hear capping your FPS to some higher level will remove most of the tearing and give you almost a zero drop in FPS..... Not too shabby at all....

    http://www.rivastation.com/index_e.htm

    I'm curious. Most testers turn VSync off to see just how high the fps can go. What would happen if VSync was re-enabled, since we all know that fps well above the monitor's refresh rate can cause the same problem? Now I'm not saying that 97 fps is way over a monitor's refresh rate, and it may be lower than what some of the better monitors can do, but I wonder if it would help.
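    The FPS-capping idea jbirney mentioned and the VSync question here are both forms of frame pacing: never present frames faster than some target rate. A minimal software limiter, sketched for illustration only (the target rate and sleep-based pacing are assumptions; real VSync instead blocks presentation until the display's vertical retrace):

```python
import time

def run_frames(render, target_fps=85, frames=100):
    """Call render() at most target_fps times per second, like a
    software FPS cap. Sleeping is illustrative; a driver's VSync
    waits on the display's vertical retrace instead."""
    period = 1.0 / target_fps
    deadline = time.monotonic()
    for _ in range(frames):
        render()
        deadline += period            # next frame's earliest present time
        delay = deadline - time.monotonic()
        if delay > 0:                 # rendered faster than the cap:
            time.sleep(delay)         # wait instead of tearing the frame
```

    With a cap at or below the refresh rate, the card never swaps mid-scan, which is why capping was reported to hide most of the HSR tearing at little FPS cost.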

    By Un4given November 29, 2000, 07:52 PM

    Now don't get me wrong, I'm not a hater or lover of either companies chipsets, but I do feel compelled to point something out.

    1. Since this is done in software in the drivers, Nvidia could just as easily do the same thing on the GeForce (a la FSAA).

    2. If the card is gaining that much ground with HSR, then the card was also severely bandwidth limited, like the GeForce, so why in the hell did they opt for SDRAM rather than DDR?

    All in all, this doesn't sound like a trick they pulled out of their sleeve so much as things they didn't get done in time to get the chips/boards out the door before they were any further behind than they already were.

    Also, could this be a little too little, a little too late, with NV20 just around the corner? Now, before anyone blasts me for this post sounding pro-Nvidia or anti-3dfx, I've already stated in another post that unless Nvidia implements some kind of HSR in NV20, they may just as well call it a marketing-hyped, overpriced GTS, because that's where the performance will be.

    By DesertCat November 29, 2000, 08:40 PM

    quote:Originally posted by Un4given:
    All in all this doesn't sound like a trick they pulled out of their sleeve so much as things that they didn't get done at the first to get the chips/boards out the door before they were any further behind than they already were.

    You are overlooking the possibility that the HSR part of these drivers may be taking advantage of some part of DirectX 8. I don't know if it is true or not, but it would help explain why HSR drivers haven't shown up until now.

    Could Nvidia build it into their drivers? Probably. Guess we will see.

    By jbirney November 29, 2000, 10:01 PM

    Hmm, well, there was a nice boost (about 10 fps) in Q3 max settings with the Det 3 drivers a few months ago....

    Yes I am sure there could be further improvements in the DET3 drivers as far as FPS scores go..

    And they opted for SDR in order to attempt to keep the price down.

    My 800x600x32 4x FSAA max-everything went from 31.1 fps to 65 fps with no artifacts... not too shabby at all :-)


    By Buttalova November 29, 2000, 10:40 PM

    Too bad you can't buy V5's for much longer hey?

    By Captain Iglo November 30, 2000, 05:55 AM

    I think this 'software emulation phase' is just a test: get HSR to work without visual artifacts, then offer HW support in Rampage or whatever it will be called.
    Remember that 3dfx has the knowledge for multichip design and could easily put an HSR chip on their cards. Rampage + SAGE + HSR chip.

    just a thought.

    By Doward November 30, 2000, 06:21 AM

    Sweet... Imagine the V5-6000 with these drivers? Whoa... I'll bet that's why they didn't release that board. Once they get these drivers right, they won't need the extra two chips. Good thing they released these soon after announcing the demise of the V5-6000, eh?

    By Humus November 30, 2000, 07:46 AM

    quote:Originally posted by Un4given:
    1. Since this is done by software in the drivers, Nvidia could just as easily to the same thing to the GeForce (ala FSAA).

    True, but it could take some time if they weren't prepared.

    quote:Originally posted by Un4given:
    2. If the card is gaining that much ground with HSR, then the card was also severaly bandwidth limited, like the GeForce, so why in the hell did they opt for SDRAM rather than DDR?

    HSR reduces the work load of the core and the memory in equal amounts. The core and memory load ratio isn't affected.
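    Humus's point can be put in numbers (made up here purely for illustration): if HSR culls some fraction of the overdraw, the fill work and the memory traffic it would have generated disappear together, leaving the core-to-memory balance untouched:

```python
# Made-up figures illustrating Humus's point: culling overdraw scales
# core work and memory traffic by the same factor, so their ratio is
# unchanged -- an SDR-vs-DDR imbalance is neither helped nor hurt.
core_load, mem_load = 100.0, 100.0   # arbitrary units of work per frame
cull = 0.40                          # fraction of overdraw removed by HSR

core_after = core_load * (1 - cull)
mem_after = mem_load * (1 - cull)
print(core_after / mem_after == core_load / mem_load)  # True
```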

    By Humus November 30, 2000, 07:47 AM

    quote:Originally posted by DesertCat:
    You are overlooking the possibility that the HSR part of these drivers may be taking advantage of some part of DirectX 8. I don't know if it is true or not, but it would help explain why HSR drivers haven't shown up until now.

    Could Nvidia build it into their drivers? Probably. Guess we will see.

    HSR is in no way related to DX8.

    By DesertCat November 30, 2000, 10:36 AM

    quote:Originally posted by Humus:
    HSR is in no way releated to DX8.

    Yeah, you are right. It was speculation on my part. From talking with people on 3dfxgamers it appears that the drivers only need DirectX 7. My bad.

    By Doward November 30, 2000, 03:43 PM

    Yeah, I like the approach 3dfx is taking, using the CPU for a lot of the operations now being used in the card itself. I mean, CPU's are easy to upgrade, and means the card can scale well with the cpu. Besides, I think 3dfx's promise of 1600x1200x32 at 60+ fps may become a reality soon. Imagine what the Rampage will do?

    By Un4given November 30, 2000, 04:48 PM

    quote:Originally posted by Humus:
    HSR reduces the work load of the core and the memory in equal amounts. The core and memory load ratio isn't affected.

    I understand that HSR reduces memory bandwidth by reducing the overdraw associated with the drawing and texturing of items in the scene that aren't visible to the player. My point was, if they know that bandwidth was a problem, why opt for SDR rather than DDR? Cost? Maybe, but DDR isn't that much more expensive.

    By jbirney November 30, 2000, 04:53 PM

    quote:Originally posted by Un4given:
    I understand that HSR reduces memory bandwidth by reducing the overdraw associated with the drawing and texturing of items in the scene that aren't visible to the player. My point was, if they know that bandwidth was a problem, why opt for SDR rather than DDR? Cost? Maybe, but DDR isn't that much more expensive.


    The VSA was originally designed before the GeForce cards and before DDR became mainstream. Back then it was a brand-new technology with a high price tag. 3dfx chose to use scalability to get the memory bandwidth it needed. Today, the GTS2 and the V5 have the same memory bandwidth even though the GF2 has the faster DDR. Scalability has some advantages.
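    jbirney's bandwidth claim checks out on a back-of-envelope basis, assuming (recalled specs, treat as approximate) that both cards run 128-bit memory buses at 166 MHz, the GTS doubling its transfer rate with DDR and the V5 5500 doubling its aggregate with one private bus per VSA-100 chip:

```python
# Back-of-envelope comparison of GeForce2 GTS vs Voodoo5 5500 memory
# bandwidth. The 166 MHz / 128-bit figures are recalled specs, not
# quoted from the thread -- treat them as approximate.
BUS_BYTES = 128 // 8  # 128-bit bus width in bytes

def bandwidth_gbs(mhz, transfers_per_clock, buses):
    return mhz * 1e6 * transfers_per_clock * BUS_BYTES * buses / 1e9

gts = bandwidth_gbs(166, transfers_per_clock=2, buses=1)  # one DDR bus
v5 = bandwidth_gbs(166, transfers_per_clock=1, buses=2)   # SDR bus per chip
print(f"GF2 GTS: {gts:.1f} GB/s, V5 5500: {v5:.1f} GB/s")
```

    Same aggregate figure either way, which is exactly the scalability trade-off jbirney describes.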

    By Doward November 30, 2000, 05:04 PM

    How about scaled DDR? Super bandwidth, yes, but SDR has lower latency!

    By Un4given November 30, 2000, 06:26 PM

    quote:Originally posted by Doward:
    How about a scaled DDR? Super bandwidth, yes, but sdr has a lower latency!

    That's actually not true. The smaller chips are available in DDR SGRAM. That is why the 32MB GTS cards actually performed slightly better at lower resolutions than the first 64MB cards: the larger chips on the 64MB cards were DDR SDRAM, while the 32MB cards used DDR SGRAM. The V5 uses 8 chips @ 8MB for 64MB. The GF2 cards use 4 chips @ 16MB for 64MB total.

    By Doward November 30, 2000, 07:29 PM

    Well then, why don't they use more SGRAM?

    By KeeperMarius November 30, 2000, 08:14 PM

    Well, I've got the Voodoo5 and just checked out the [beta] HSR drivers. Overall, they're really neat. I didn't see any artifacts at 1024x768x32, but I did get a good 20fps framerate boost, which is now in the 80's.

    First things first though: the higher the resolution, the more artifacts. Low rez doesn't display ANY of this, but high rez does, and when you enable FSAA it adds to the artifact list. It's rather funny how the 1600x1200x32 scores turn out with aggressive HSR enabled. And yes, it is actually playable, just A LOT of corruption. I got 55.6fps at 1600x1200x32 on my own custom timedemo. It's a bit more demanding and longer than the basic ones, but I feel it's a bit more accurate.

    Here are some examples of some of the HSR effects:

    1) Drastically higher framerate! Low rez, usually 1024x768 and below, doesn't suffer from much corruption, if any at all. Like I said, I didn't see any corruption at 1024x768x32 with everything maxed. But I did notice a [massive] fps boost.

    Corruption, when it occurs:

    1) Feels like only half the input is coming from your mouse.
    2) Missing textures; you can see objects on the other side of the wall when you're running around a corner.
    3) Ultra-high resolution with low HSR skips frames. This is where even I think some of the reported framerates are bogus, because the framerate counter skyrockets when there are missing frames. Interesting.....

    Ultra high resolution with high HSR doesn't skip frames, but does give off a considerable amount of artifacts.

    ********************************************

    Why didn't 3dfx opt for DDR memory for the Voodoo5? I think there are several [major] reasons.

    1) The VSA-100 chip doesn't actually support DDR.
    2) Although bandwidth limited, it isn't nearly as bandwidth limited, efficiency-wise, as the GeForce GTS.
    3) Not cost effective. Voodoo5's debuted at $300 US. Incorporating DDR memory would have required an insane sale price, roughly $450+.
    4) I think it would offset the core-to-memory ratio. Remember, except for the Banshee, 3dfx runs their core and memory at the same speed. It's not like the Nvidia cards where you can overclock the core and memory separately. I believe they do this to achieve greater stability, or something like that.

    ******************************************

    OK, back to the HSR stuff. Will Nvidia probably do this? Oh yeah, I think so. No offense, but Nvidia seems to gobble up the best of the best from the market, and they've already talked about it with the NV20, so I'm sure they'll incorporate HSR also. We'll just have to see how well they do it.

    In the end though, this is great stuff from 3dfx. I'm sure they'll eventually get bugs down to a minimum, but the new betas show EXTREME promise.

    REMEMBER THOUGH, THEY'RE JUST BETA.

    -KeeperMarius

    By Humus November 30, 2000, 10:15 PM

    quote:Originally posted by Un4given:
    I understand that HSR reduces memory bandwidth by reducing the overdraw associated with the drawing and texturing of items in the scene that aren't visible to the player. My point was, if they know that bandwidth was a problem, why opt for SDR rather than DDR? Cost? Maybe, but DDR isn't that much more expensive.

    The thing is that it doesn't only reduce memory bandwidth needs; it reduces the amount of drawing, so memory load and core load are still just as tightly linked to each other as before.

    When we are talking about the HSR in tile based rendering the situation is quite different, but V5 doesn't have this.

    By Un|cO November 30, 2000, 11:47 PM

    OK, while you guys are all interested in the V5 and Rampage effects, do any of you think that 3dfx might include any HSR functions in their older products via a driver update? Say the V3? Could this be done? Are they gonna bother?

    By Un|cO November 30, 2000, 11:50 PM

    It could be interesting just to see if 3dfx can still give the old V2 and V3 some life again.

    By Wedge December 01, 2000, 03:50 AM

    quote:Originally posted by Buttalova:
    Too bad you can't buy V5's for much longer hey?

    Why can't we buy them for much longer?? Or are you really that dense?

    By Doward December 01, 2000, 07:19 AM

    ::laughs!:: Oh yeah, the SLI V2 shall live again! It topped out at 1024x768, so it should artifact very much. I'd love to see a V2 SLI put up a good fight with, say, a GeForce, or an MX.

    By Doward December 01, 2000, 09:26 AM

    Erm... should NOT

    By Un|cO December 01, 2000, 10:10 AM

    OK, maybe not the V2, but why not the V3s? I mean, if you can do it, you might as well.

    By Un|cO December 01, 2000, 10:26 AM

    Well, well, check this out
    http://bansheedrivers.homestead.com

    HSR for banshees and V3?? maybe, just maybe


    Contact Us | www.SharkyForums.com

    Copyright 1999, 2000 internet.com Corporation. All Rights Reserved.


    Ultimate Bulletin Board 5.46

    SharkyForums.Com - Print: New Beta V5 Drivers - HSR? Maybe? Look like it?? Yea :-)

    New Beta V5 Drivers - HSR? Maybe? Look like it?? Yea :-)
    By jbirney November 29, 2000, 07:11 AM

    I noticed that 3dfx slipped us some new beta drivers that are suppose to give those lucky SOB Tribes2 beta tester some preformance improvements. Lookie what I found in the read me under the glide section:


    "Glide has received continuous updates in line with the latest released titles. See section 13.11 for specific application issues.
    Additional features:
    AA Toggle Key and Vertical Sync a new Hidden Surface Removal option see 3dfx tools for details.
    "


    This was in the glide section only..but it shows promise..I have yet to try them..getting late must sleep.....

    Get em here: http://www.3dfxgamers.com/view.asp?IOID=2494


    Wonder if any other games, cough UT cough, will see any improvements in the GLIDE HSR if it indeed works as we hope it might...hmmmm


    UPDATE:

    we have already seen reports roll in the 3dfx maessage board that the HSR is working and can give you up to a 25 FPS increase in fill rate limited conditions:
    http://www.3dfxgamers.com/boards.asp?BOID=5&THID=96022&boardsort=lastpost&boardpage=1


    Wowsers!

    By jbirney November 29, 2000, 07:15 AM

    here are some numbers from Sharkfood:


    Just off the top. Q3, 1024x768x32, all settings maxed. demo001
    HSR=0, 76.7 fps
    HSR=1, 91.2 fps
    HSR=2, 104.6 fps
    HSR=3, 108.7 fps
    HSR=4, 109.1 fps


    Please note that there were some artifacts with the HSR=4 setting. I am still down loading them...dam nice gain :-)

    By Doward November 29, 2000, 07:36 AM

    I like it... another reason to get a V5 over a GTS for me!

    By DesertCat November 29, 2000, 10:27 AM

    quote:Originally posted by jbirney:
    here are some numbers from Sharkfood:


    Just off the top. Q3, 1024x768x32, all settings maxed. demo001
    HSR=0, 76.7 fps
    HSR=1, 91.2 fps
    HSR=2, 104.6 fps
    HSR=3, 108.7 fps
    HSR=4, 109.1 fps


    Please note that there were some artifacts with the HSR=4 setting. I am still down loading them...dam nice gain :-)


    This is great news! Now for all of those board reviews using Q3 as the holy grail of benchmarks, I demand a recount!

    By DesertCat November 29, 2000, 10:31 AM

    Now the big questions become: 1) When will HSR be incorporated into the D3D drivers? 2) Does this HSR implementation use SSE and/or 3DNow!? 3) How does this scale across different CPUs, 4) When will 3dfx finalize these puppies and make them the official drivers?

    By jbirney November 29, 2000, 10:52 AM

    quote:Originally posted by DesertCat:
    Now the big questions become:

    1) When will HSR be incorporated into the D3D drivers?

    I do not know. At the highest HSR setting there was some tearing of the textures reported. So they are still working on them. I'd imagine they will make it over to D3D, but your guess is as good as mine


    quote:
    2) Does this HSR implementation use SSE and/or 3DNow!?

    No clue. I have seen post of about equal gains per the same CPU speed for athons and p3 users.

    quote:
    3) How does this scale across different CPUs,

    Here is where I think that we will see some bigger gains for the faster CPU. One AMD owner had a 1g CPU and got almost x2 FPS increase. Now I have no idea if those are legit numbers or not. But if I understand they way there are using HSR correctly, then the faster CPU may offer better FPS gains. We know that in fill rate limited conditions, the CPU has a lot of idle time.

    Speculation mode = 1;

    Here is where HSR drivers are working their magic, it is using these extra CPU cycles to help figure out what is hidden and what is not at the driver level. And with a faster CPU you have more idle time, or more time to figure out what should be drawn and what should not be drawn, then it dumps what it has to the V5 to draw. Of course I have not seen the source code of the drivers and trying to make a simple generalization that MAY explain why were get higher gains on faster CPU

    Speculation mode = 0;


    quote:
    4) When will 3dfx finalize these puppies and make them the official drivers?

    Hopefully they will get these in D3D and see if there is anything that can be done with the tearing. there are 4 levels of HSR, with the two highest have some tearing in Q3 on certain levels. There are only 9 fps difference in the top 3 settings. So you can run using the 3rd fastest HSR tearing, and get FPS only 9 fps slower than the fastest. I did not see any tearing in UT. But these drivers are less than 24 old for the public. Lets wait and see they are tested more and more :-)

    By DesertCat November 29, 2000, 11:33 AM

    Good comments Jbirney. I imagine that we will not know the full answers for a while on these, especially since they are beta. When they do become official, I expect 3dfx will release more of this kind of info.

    Hopefully the 3dfx guys will talk this stuff up with the technically oriented review sites when they get these dirvers polished. I know I'm curious.

    By Doward November 29, 2000, 01:12 PM

    Oh, yes, the V5 shall gain some ground here! How's it comparing to the GTS now? I think 3dfx has some more tricks up it's sleeve, and will perfect these drivers. Great work, 3dfx!

    By jbirney November 29, 2000, 01:23 PM

    Well over at Riva Station they have some prelimary bench marks...
    how dose 97.1 fps in q3 16x12x32 max settings sound you folks?? Ultra = 57.1 keep in mind that the tearing makes it unplayable for that bench mark, so the ultra wins that one....but still it shows that the theory works and has potential. I hear capping your FPS to some higher level will remove most of the tearing and give you almost a zero drop in FPS..... Not too shabby at all....

    http://www.rivastation.com/index_e.htm

    By Un4given November 29, 2000, 07:37 PM

    quote:Originally posted by jbirney:
    Well over at Riva Station they have some prelimary bench marks...
    how dose 97.1 fps in q3 16x12x32 max settings sound you folks?? Ultra = 57.1 keep in mind that the tearing makes it unplayable for that bench mark, so the ultra wins that one....but still it shows that the theory works and has potential. I hear capping your FPS to some higher level will remove most of the tearing and give you almost a zero drop in FPS..... Not too shabby at all....

    http://www.rivastation.com/index_e.htm

    I'm curious. Most testers turn VSync off to see just how high the fps can go. What would happen if VSync was re-enabled, since we all know that fps severely above the monitors refresh rate can cause the same problem. Now I'm not saying that 97 fps is severely over a monitors refresh rate, and may be lower than some of the better monitors, but I wonder if it would help.

    By Un4given November 29, 2000, 07:52 PM

    Now don't get me wrong, I'm not a hater or lover of either companies chipsets, but I do feel compelled to point something out.

    1. Since this is done by software in the drivers, Nvidia could just as easily to the same thing to the GeForce (ala FSAA).

    2. If the card is gaining that much ground with HSR, then the card was also severaly bandwidth limited, like the GeForce, so why in the hell did they opt for SDRAM rather than DDR?

    All in all this doesn't sound like a trick they pulled out of their sleeve so much as things that they didn't get done at the first to get the chips/boards out the door before they were any further behind than they already were.

    Also, could this be a little to little, a little too late with NV20 just around the corner? Now, before anyone blasts me for this post sound pro-Nvidia or anti-3dfx, I've already stated in another post that unless Nvidia does implement some kind of HSR in NV20, they may just as well call it a marketing hyped, overpriced GTS, because that's where the performance will be.

    By DesertCat November 29, 2000, 08:40 PM

    quote:Originally posted by Un4given:
    All in all this doesn't sound like a trick they pulled out of their sleeve so much as things that they didn't get done at the first to get the chips/boards out the door before they were any further behind than they already were.

    You are overlooking the possibility that the HSR part of these drivers may be taking advantage of some part of DirectX 8. I don't know if it is true or not, but it would help explain why HSR drivers haven't shown up until now.

    Could Nvidia build it into their drivers? Probably. Guess we will see.

    By jbirney November 29, 2000, 10:01 PM

    Hmm well there was a nice (about 10 fps bost) in q3 max setting with the Det 3 drivers a few months ago....

    Yes I am sure there could be further improvements in the DET3 drivers as far as FPS scores go..

    And they opted for SDR in order to attempt to keep the price down.

    My 800x600x32 x4 fsaa max everthing went from 31.1 fps to 65 fps with no artifacts...not too shabby at all :-)


    By Buttalova November 29, 2000, 10:40 PM

    Too bad you can't buy V5's for much longer hey?

    By Captain Iglo November 30, 2000, 05:55 AM

    i think this 'software emulation phase' is just a test: get HSR to work without visual artifacts, and offer HW support for Rampage or whatever it will be called.
    remember that 3dfx has the knowledge for multichip design and could easily install a HSR chip on their cards. Rampage + SAGE + HSR chip.

    just a thought.

    By Doward November 30, 2000, 06:21 AM

    Sweet... Imagine the V5-6000 with these drivers? Whoa... I'll bet that's why they didn't release that board. Once these drivers are get right, they won't need the extra two chips. Good thing they released these soon after announcing the demise of the V5-6000, eh?

    By Humus November 30, 2000, 07:46 AM

    quote:Originally posted by Un4given:
    1. Since this is done by software in the drivers, Nvidia could just as easily to the same thing to the GeForce (ala FSAA).

    True, but could take some time if they weren't prepared.

    quote:Originally posted by Un4given:
    2. If the card is gaining that much ground with HSR, then the card was also severaly bandwidth limited, like the GeForce, so why in the hell did they opt for SDRAM rather than DDR?

    HSR reduces the work load of the core and the memory in equal amounts. The core and memory load ratio isn't affected.

    By Humus November 30, 2000, 07:47 AM

    quote:Originally posted by DesertCat:
    You are overlooking the possibility that the HSR part of these drivers may be taking advantage of some part of DirectX 8. I don't know if it is true or not, but it would help explain why HSR drivers haven't shown up until now.

    Could Nvidia build it into their drivers? Probably. Guess we will see.

    HSR is in no way releated to DX8.

    By DesertCat November 30, 2000, 10:36 AM

    quote:Originally posted by Humus:
    HSR is in no way releated to DX8.

    Yeah, you are right. It was speculation on my part. From talking with people on 3dfxgamers it appears that the drivers only need DirectX 7. My bad.

    By Doward November 30, 2000, 03:43 PM

    Yeah, I like the approach 3dfx is taking, using the CPU for a lot of the operations now being used in the card itself. I mean, CPU's are easy to upgrade, and means the card can scale well with the cpu. Besides, I think 3dfx's promise of 1600x1200x32 at 60+ fps may become a reality soon. Imagine what the Rampage will do?

    By Un4given November 30, 2000, 04:48 PM

    quote:Originally posted by Humus:
    HSR reduces the work load of the core and the memory in equal amounts. The core and memory load ratio isn't affected.

    I understand that HSR reduces memory bandwidth by reducing the overdraw associated with the drawing and texturing of items in the scene that aren't visible to the player. My point was, if they know that bandwidth was a problem, why opt for SDR rather than DDR? Cost? Maybe, but DDR isn't that much more expensive.

    By jbirney November 30, 2000, 04:53 PM

    quote:Originally posted by Un4given:
    I understand that HSR reduces memory bandwidth by reducing the overdraw associated with the drawing and texturing of items in the scene that aren't visible to the player. My point was, if they know that bandwidth was a problem, why opt for SDR rather than DDR? Cost? Maybe, but DDR isn't that much more expensive.


    The VSA was originally designed before the GeForce cards and before DDR became mainstream. Back then DDR was a brand-new technology with a high price tag. 3dfx chose scalability to make up the memory bandwidth instead. Today, the GTS2 and the V5 have the same memory bandwidth even though the GF2 has the faster DDR. Scalability has some advantages.
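    The "same memory bandwidth" claim checks out on the back of an envelope. A minimal sketch, assuming the period spec-sheet figures (166 MHz clocks, 128-bit buses) rather than anything stated in this thread:

```python
# Back-of-envelope check: Voodoo5 5500 (2x SDR) vs GeForce2 GTS (1x DDR).
# Clock speeds and bus widths are assumed from period spec sheets.

def bandwidth_gb_s(clock_mhz, bus_bits, data_rate):
    """Peak memory bandwidth in GB/s: clock * bus width in bytes * data rate."""
    return clock_mhz * 1e6 * (bus_bits / 8) * data_rate / 1e9

# Voodoo5 5500: two VSA-100 chips, each with its own 128-bit SDR bus at 166 MHz
v5 = 2 * bandwidth_gb_s(166, 128, 1)

# GeForce2 GTS: a single 128-bit DDR bus at 166 MHz (333 MHz effective)
gf2 = bandwidth_gb_s(166, 128, 2)

print(f"V5 5500: {v5:.1f} GB/s, GeForce2 GTS: {gf2:.1f} GB/s")
```

    Two chips with single data rate, or one bus at double data rate: either way you land at roughly 5.3 GB/s of peak bandwidth.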

    By Doward November 30, 2000, 05:04 PM

    How about a scaled DDR? Super bandwidth, yes, but SDR has a lower latency!

    By Un4given November 30, 2000, 06:26 PM

    quote:Originally posted by Doward:
    How about a scaled DDR? Super bandwidth, yes, but SDR has a lower latency!

    That's actually not true. The smaller-capacity chips were available in DDR SGRAM. That's why the 32MB GTS cards actually performed slightly better at lower resolutions than the first 64MB cards: the larger chips on the 64MB cards were DDR SDRAM, while the 32MB cards used DDR SGRAM. The V5 uses 8 chips @ 8MB for 64MB. The GF2 cards use 4 chips @ 16MB for 64MB total.

    By Doward November 30, 2000, 07:29 PM

    Well then, why don't they use more SGRAM?

    By KeeperMarius November 30, 2000, 08:14 PM

    Well, I've got the Voodoo5 and just checked out the [beta] HSR drivers. Overall, they're really neat. I didn't see any artifacts at 1024x768x32, but I did get a good 20fps framerate boost, which puts me in the 80s now.

    First things first though: the higher the resolution, the more artifacts. Low res doesn't display ANY of this, but high res does, and enabling FSAA adds to the artifact list. It's rather funny how the 1600x1200x32 scores turn out with aggressive HSR enabled. And yes, it is actually playable, just with A LOT of corruption. I got 55.6fps at 1600x1200x32 on my own custom timedemo. It's a bit more demanding and longer than the standard ones, but I feel it's a bit more accurate.

    Here are some examples of some of the HSR effects:

    1) Drastically higher framerate! Low res, usually 1024x768 and below, doesn't suffer from much corruption, if any at all. Like I said, I didn't see any corruption at 1024x768x32 with everything maxed. But I did notice a [massive] fps boost.

    Corruption, when it occurs:

    1) Feels like only half the input from your mouse is registering.
    2) Missing textures: you can see objects on the other side of a wall as you run around a corner.
    3) Ultra-high resolution with low HSR skips frames. This is where even I think some of the reported framerates are bogus, because the framerate counter skyrockets when there are missing frames. Interesting.....

    Ultra-high resolution with high HSR doesn't skip frames, but it does produce a considerable number of artifacts.

    ********************************************

    Why didn't 3dfx opt for DDR memory for the Voodoo5? I think there are several [major] reasons.

    1) The VSA-100 chip doesn't actually support DDR.
    2) Although bandwidth limited, it isn't nearly as bandwidth limited, efficiency-wise, as the GeForce GTS.
    3) Not cost effective. Voodoo5's debuted at $300 US. Incorporating DDR memory would have required an insane sale price, roughly $450+.
    4) I think it would offset the core-to-memory clock ratio. Remember, except for the Banshee, 3dfx runs their core and memory at the same speed. It's not like the Nvidia cards, where you can overclock the core and memory separately. I believe they do this to achieve greater stability, or something like that.
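    The bandwidth-limited point above is easy to sanity-check. A rough sketch of raw per-frame memory traffic, under made-up but plausible assumptions (a Z read, a Z write, a color write, and roughly one texel fetch per rendered pixel, about 16 bytes total, with ~3x overdraw):

```python
# Rough framebuffer-traffic estimate. The per-pixel byte cost and the
# overdraw factor are illustrative assumptions, not measurements.

def frame_traffic_gb_s(width, height, fps, overdraw, bytes_per_pixel=16):
    """Approximate memory traffic in GB/s for color + Z + texture per pixel."""
    return width * height * overdraw * bytes_per_pixel * fps / 1e9

# The 1600x1200x32 @ 60 fps target mentioned in the thread, with ~3x overdraw
need = frame_traffic_gb_s(1600, 1200, 60, overdraw=3)
have = 5.3  # approx. peak bandwidth of the V5 5500 in GB/s

print(f"needed ~{need:.1f} GB/s vs ~{have} GB/s available")
```

    Even this crude estimate overshoots the card's peak bandwidth, which is exactly why cutting overdraw with HSR is what would make 1600x1200x32 at 60 fps plausible.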

    ******************************************

    Ok, back to the HSR stuff. Will Nvidia probably do this? Oh yeah, I think so. No offense, but Nvidia seems to gobble up the best of the best from the market, and they've already talked about it for the NV20, so I'm sure they'll incorporate HSR also. We'll just have to see how well they do it.

    In the end though, this is great stuff from 3dfx. I'm sure they'll eventually get bugs down to a minimum, but the new betas show EXTREME promise.

    REMEMBER THOUGH, THEY'RE JUST BETA.

    -KeeperMarius

    By Humus November 30, 2000, 10:15 PM

    quote:Originally posted by Un4given:
    I understand that HSR reduces memory bandwidth by reducing the overdraw associated with the drawing and texturing of items in the scene that aren't visible to the player. My point was, if they know that bandwidth was a problem, why opt for SDR rather than DDR? Cost? Maybe, but DDR isn't that much more expensive.

    The thing is, it doesn't specifically reduce memory bandwidth needs; it reduces the amount of drawing, so memory load and core load are still just as tightly linked to each other as before.

    When we are talking about HSR in tile-based rendering the situation is quite different, but the V5 doesn't have that.
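    Humus's point can be shown with a toy model: if HSR only shrinks the number of pixels drawn, core work and memory traffic shrink by the same factor, so their ratio never changes. The per-pixel cost constants here are made up for illustration:

```python
# Toy model: HSR cuts pixels drawn, so core and memory load scale together.
# Per-pixel cost constants are arbitrary illustrative values.

CORE_COST = 1.0   # core work units per pixel drawn
MEM_COST = 16.0   # bytes of memory traffic per pixel drawn

def load(pixels_drawn):
    """Return (core load, memory load) for a given number of drawn pixels."""
    return (pixels_drawn * CORE_COST, pixels_drawn * MEM_COST)

visible = 1600 * 1200
core_a, mem_a = load(visible * 3)         # 3x overdraw, no HSR
core_b, mem_b = load(int(visible * 1.5))  # HSR halves the overdraw

# Both loads halve, but the memory:core ratio is identical before and after.
print(core_b / core_a, mem_b / mem_a, mem_a / core_a == mem_b / core_b)
```

    That is the contrast with a tile-based renderer, where hidden surfaces are rejected before shading in on-chip tile memory, so external memory traffic drops by more than core work does.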

    By Un|cO November 30, 2000, 11:47 PM

    Ok, while you guys are all interested in V5 and Rampage effects, do any of you think 3dfx might include HSR functions in a driver update for their older products? Say the V3? Could this be done? Are they gonna bother?

    By Un|cO November 30, 2000, 11:50 PM

    Could be interesting just to see if 3dfx can still give the old V2 and V3 some life again.

    By Wedge December 01, 2000, 03:50 AM

    quote:Originally posted by Buttalova:
    Too bad you can't buy V5's for much longer hey?

    Why can't we buy them for much longer?? Or are you really that dense?

    By Doward December 01, 2000, 07:19 AM

    ::laughs!:: Oh yeah, the SLI V2 shall live again! It topped out at 1024x768, so it should artifact very much. I'd love to see a V2 SLI setup put up a good fight with, say, a GeForce or an MX.

    By Doward December 01, 2000, 09:26 AM

    Erm... should NOT

    By Un|cO December 01, 2000, 10:10 AM

    Ok, maybe not the V2, but why not the V3s? I mean, if you can do it, you might as well.

    By Un|cO December 01, 2000, 10:26 AM

    Well, well, check this out
    http://bansheedrivers.homestead.com

    HSR for Banshees and the V3?? Maybe, just maybe.






    Copyright 2002 INT Media Group, Incorporated. All Rights Reserved.