SharkyForums.Com - Print: NVIDIA Industry Interview

    NVIDIA Industry Interview
    By alexross January 17, 2001, 05:58 PM

    Hi there folks,

    Over the past few months, the Forums have grown tremendously and become a really useful resource, not only for you, the users, but for us as well. We would very much like to thank you for your participation and your questions and answers on so many technical issues. Your feedback has been greatly appreciated, and it has often been implemented in one form or another on the main www.sharkyextreme.com site. So keep 'em coming.

    Today sees our school of Sharks kick off a new feature here on www.sharkyforums.com, and again we invite you to participate even further. Once in a while we will start a thread dedicated to generating an interview with an industry company executive, PR person or general know-it-all.

    The first company to go under your hammer is NVIDIA Corporation. Although we recently posted an interview with NVIDIA (see the interview here), many of you had questions that were left unanswered. Whether it's about the company's recent acquisition of 3dfx, the Xbox, DirectX 8 and/or the NV20, fire away. This is your chance, folks. Just post your questions by replying to this topic (you can't start a new topic). We will pick the 20 best questions, submit them to the folks over at NVIDIA, then post the answers to your questions here and on www.sharkyextreme.com.

    Thanks for taking part and for supporting us, as you have for two and a half years. Once again, thank you for flying with us; we know you have a choice…

    By Blizzard January 17, 2001, 06:32 PM

    Hi.
    I actually have two questions for the interview, and they are as follows:

    1. What will your next-generation graphics card cost?

    2. When will your next-generation graphics card be released?

    -Blizzard

    By NEwBoY January 17, 2001, 07:57 PM

    1. Will NVIDIA constantly be trying to push current memory and GPU architecture to the limit, or will NVIDIA be developing newer, smarter and more scalable technology to address memory bottlenecks and the like? An example being the BitBoys Glaze3D product and its embedded DRAM, XBA memory system and multiple rendering pipelines.

    2. Will NVIDIA focus more on pushing speed and fill rate, or now turn its attention towards newer image-enhancement techniques that greatly improve image quality, such as FSAA?

    Thanks

    By nkeezer January 17, 2001, 08:07 PM

    ATi has stated that digital TV (and by extension, HDTV) on the PC is still anywhere from 3-6 months away. Is NVIDIA developing any type of digital television tuner, and if so, what kind of time frame are we looking at before it comes to market?

    Now that NVIDIA has effectively eliminated most of the competition in the high-end graphics arena (excluding ATi), will there be an official shift away from the hyperkinetic 6 month cycle in favor of a more reasonable 9 or 12 month cycle, especially considering the ever-increasing prices of graphics boards?

    By Ymaster January 17, 2001, 08:41 PM

    http://webcast.mediaondemand.com/xbox/20010106/06_clip1_keynote_28100.asx

    The Windows Media clip linked above shows the Xbox when it was first unveiled.

    My questions are:

    Is the demo computer that runs at 1/5th the speed really just a fast computer with an NV20 GPU video card?

    If that is true of the demo computer, then should we expect to see our NV20 video cards running at about the same speed as the demo when the new card is launched?

    Thank you. Ymaster

    By CajnDave January 17, 2001, 09:46 PM

    How long until Nvidia can take full advantage of AGP 4X?
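For context on the AGP question, the theoretical peak numbers are easy to work out: AGP moves data over a 32-bit bus at a 66 MHz base clock, and each "x" mode transfers that many data words per clock. A quick sketch of the arithmetic (these are standard theoretical peaks, not measured throughput; the commonly quoted figures of 266/533/1066 MB/s come from the exact 66.66 MHz clock):

```python
def agp_bandwidth_mb_s(multiplier):
    """Theoretical peak AGP bandwidth in MB/s for a given transfer mode."""
    bus_bytes = 32 // 8    # 32-bit bus -> 4 bytes per transfer
    base_clock_mhz = 66    # AGP base clock (nominally 66.66 MHz)
    return bus_bytes * base_clock_mhz * multiplier

for mode in (1, 2, 4):
    print(f"AGP {mode}x: ~{agp_bandwidth_mb_s(mode)} MB/s")
```

Whether a given GPU can actually saturate that ~1 GB/s in 4x mode is exactly what the question is getting at.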

    By email_atif January 17, 2001, 10:48 PM

    Wassup SharkyForum Readers,

    Truthfully, I have been a very loyal and avid fan of 3dfx in the past. Upon hearing of the sale of core 3dfx assets to nVidia, I was disappointed to see the company I'd respected and valued for so long eaten up by nVidia. I realize that the future of 3dfx now lies within nVidia, so my question pertains to certain nVidia acquisitions resulting from the 3dfx deal. Will nVidia implement 3dfx T&L and cinematic effects in their next product line? (Obviously not in the NV20, but perhaps at a later date.) nVidia also acquired the 3dfx licenses pertaining to product names, if I am not mistaken. My second question is: does nVidia plan on using the Voodoo name, and taking advantage of the fan base associated with it, in a future product, possibly the value-market video cards? I appreciate this opportunity, and I thank Sharky for this gift.

    Late

    By slipgun January 18, 2001, 12:28 AM

    Here's my question: Didn't you guys buy 3dfx's "Rampage" chipset plans? Do you plan to implement any designs from Rampage into your future chipsets?

    By Sol January 18, 2001, 08:24 AM

    Great idea.

    Here is my question (you can check out the Xbox debates in the General Games Forum): how will the Xbox graphics chip be different from the PC graphics chip?

    By Duo January 18, 2001, 02:53 PM

    1. Why did you decide that this was the right time to jump into the Mac market?

    2. Will you continue to produce Mac-compatible cards in the near future (i.e. GF2U, NV20)? Will the Mac versions be released at the same time as their PC counterparts?

    Duo

    By Mr. Silver January 18, 2001, 07:23 PM

    My question:

    What are your plans for the production of nVidia technology in Macs?

    Also:

    Will the NV20 (or any future chipset) implement any of the recently acquired 3dfx technology?

    By AMD_Forever January 18, 2001, 08:10 PM

    1. Now that FSAA has been shown to have too great a performance hit in its current implementation, does Nvidia plan to improve upon the process or eventually abandon support for the feature? Other than to compete with 3dfx on feature sets, did Nvidia have any real desire to see FSAA implemented?
    2. How long will it be until Nvidia is able to correct hardware incompatibilities with non-Intel chipsets, VIA in particular? Nobody would have blamed you in the days of Socket 7, but AMD platforms are becoming more mainstream, and if Nvidia graphics chips aren't running correctly on every PC worldwide, that's a bad thing, isn't it? I love dandy new Nvidia features, especially when I get to go into the BIOS and disable every single one just because I have a VIA chipset.
    3. How long do you think it will be until you implement 3dfx technology? What specific parts do you plan to implement first? Or did you just buy 3dfx to eliminate one more enemy?

    That is all.
    Hail the fatherland............errr Nvidia!!
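The performance hit complained about above follows directly from how FSAA worked in that era: brute-force (ordered-grid) supersampling renders N samples per pixel, so fill-rate and memory bandwidth demand scale by roughly N. A toy illustration of the cost (the base fill-rate figure is invented for the example):

```python
def effective_fillrate(base_mpixels_per_s, samples_per_pixel):
    """Brute-force supersampling divides usable fill rate by the sample count."""
    return base_mpixels_per_s / samples_per_pixel

base = 800  # hypothetical GPU pixel fill rate in Mpixels/s
for samples in (1, 2, 4):
    usable = effective_fillrate(base, samples)
    print(f"{samples}x SSAA: ~{usable:.0f} Mpixels/s usable")
```

That factor-of-four drop at 4x is why the feature felt unusable at high resolutions on the hardware of the day.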

    By lmpulse January 18, 2001, 11:43 PM

    My question:
    How soon will Nvidia start to incorporate 3dfx technology into its products, e.g. hidden surface removal, SLI and motion blur?

    By mazeikabedes January 19, 2001, 12:20 AM

    My question is:
    The new Power Macs have the GeForce2 MX in them. As Nvidia makes reference boards, I was wondering who makes the actual PCB for the Macintosh computers? Is it Apple themselves, given that most motherboard companies make graphics cards and Apple makes its own motherboards?

    By brYceMaN January 19, 2001, 01:24 AM

    My Questions:

    What did you do with those Matrox engineers that you poached?

    How much of your company do you predict that Matrox will own after the lawsuit is settled?

    By Dirk Diggler January 19, 2001, 01:34 AM

    What's the go with the new NVIDIA chip, the NV20? Will it be an entirely new architecture, or just an advancement on the current GeForce chipset? Will there be support for AGP 8x when it comes out, and what will the benefits be?

    By yyforever January 19, 2001, 02:25 AM

    quick question:

    Voodoo5 6000?

    By flerchin January 19, 2001, 03:09 AM

    1. When will the "ugly sky" bug shown on anandtech.com and elsewhere for Quake 3 Arena be ironed out of S3TC on GeForce-based cards?

    2. When will the corrupted textures that occur in Midtown Madness with GeForce-based cards be addressed?

    By kwilliam January 19, 2001, 03:15 AM

    Will the NV20 feature the TwinView capability that is present on GeForce2 MX cards?

    When will the GeForce2 Go notebook chips be coming out? About how fast will they run? Like a TNT2, or a GeForce2 MX? How much RAM can they have? Will they be just in Intel notebooks, or will any be paired with the upcoming notebooks with AMD processors?

    Years ago, nobody thought you guys could take down 3dfx for the best-performing chip. Can Nvidia take down ATI by getting OEM deals that were not possible before, such as in Macs and upcoming notebooks?

    Will there be any future products with a PCI interface, for those who are stuck with onboard video and have only open PCI slots as an option?

    By cQc*one January 19, 2001, 03:38 AM

    Will the NV20 have an internal cache or embedded RAM? If not, why does the NV20 have 60 million transistors when half that number would be enough for what the NV20 is capable of?

    Are you (NVIDIA) sure that users are willing to pay $450-$500 for a "standard" graphics card? And how much will the pro and standard versions of the card cost?

    Will the 2D part of the NV20 be better than that of the GeForces? (The image was a bit blurry above 1024x768.) And will there be better hardware video decoding (things like the motion compensation we see on ATI cards)?

    By Unicron January 19, 2001, 03:52 AM

    My question is: why does your top-of-the-line video card cost approximately the same as the top-of-the-line processors out there? Is this because of your video card monopoly, or because they are more expensive to make?

    By madcran January 19, 2001, 04:09 AM

    One question. When will Microsoft buy out NVIDIA and rule the world?

    By El_Panzon January 19, 2001, 04:45 AM

    When is your mainboard chipset coming out? Can you give final specs? How does it differ from the one used in the XBox?

    By Nagorak January 19, 2001, 04:53 AM

    1) Does Nvidia realize that hardly anyone is willing to pay $500 for a graphics card? I know many people who are "hardcore gamers", myself included, and none of them are seriously (or even jokingly) considering dropping $500 on a graphics card.

    2) When did Nvidia get the idea to try charging this much? $200-$250 has been the 'norm' for a new graphics card for several years now. Now all of a sudden with the release of the Geforce 2 Ultra, Nvidia seems to have developed the notion that $500 is a good price to charge for a card!

    3) Why is it that the X-Box, which has a pseudo-NV20 in it, is only going to cost $300 total, when the NV20 itself is going to cost $500?

    4) Does Nvidia realize that if the ATI Radeon II (or any other card released by any other company) offers performance anywhere NEAR that of the NV20 and if it costs less than $300 (which it most likely will), then they are screwed and will make no sales?

    By Nagorak January 19, 2001, 04:56 AM

    5) If the NV20 costs so much not because Nvidia is trying for bigger profit margins, but because of the great expense for high speed DDR memory, when is Nvidia going to go back to the drawing board and develop smarter technology that is actually affordable?!!

    By Olive January 19, 2001, 05:14 AM

    Do you plan to participate in the development of a higher-level API than OpenGL and Direct3D immediate mode, with scene-graph management such as Performer, or game engines like Quake or Unreal?

    By dngrmous January 19, 2001, 06:22 AM

    I recently read an article on Digit-Life that professed to be an in-depth features analysis of the upcoming NVidia NV20 GPU.

    I was somewhat disappointed to read that NVidia's only plan to help alleviate the ever-present memory bandwidth limitations that all current 3D accelerators face is simply to use even faster DDR SDRAM, in combination with some fancy Z-buffer tricks to help conserve memory bandwidth (a la ATI's HyperZ). While a brute-force solution (i.e. even faster DDR SDRAM) is nice and may indeed work, it's not very elegant. And while fancy Z-buffer compression and early polygon culling techniques are a welcome addition, they are not terribly innovative features in the sense that they've already been implemented by other manufacturers.
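As a rough illustration of the early-rejection idea mentioned above (all names in this sketch are invented for the example): the bandwidth savings come from depth-testing each fragment before doing any expensive texturing or shading, so occluded fragments never cost texture or color traffic.

```python
def draw_fragments(fragments, z_buffer, shade):
    """Early-Z sketch: depth-test each fragment *before* shading it.

    fragments: iterable of (x, y, depth) tuples; smaller depth = closer.
    z_buffer:  dict mapping (x, y) -> closest depth seen so far.
    shade:     the expensive per-fragment work (texture reads, blending).
    """
    shaded = 0
    for x, y, depth in fragments:
        if depth < z_buffer.get((x, y), float("inf")):
            z_buffer[(x, y)] = depth
            shade(x, y)    # only fragments visible so far pay this cost
            shaded += 1
        # occluded fragments are rejected here, saving bandwidth
    return shaded

# Two overlapping fragments at one pixel, drawn front to back:
# only the closer one triggers shading.
count = draw_fragments([(0, 0, 0.2), (0, 0, 0.7)], {}, lambda x, y: None)
print(count)  # 1
```

Drawing front to back maximizes the rejections, which is why the technique pairs naturally with the polygon-sorting and Z-compression tricks the post describes.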

    To be honest, I was hoping to see a new, perhaps revolutionary solution to this long-standing problem. Ever since hearing about the upcoming NVidia "Crush" motherboard chipset last November, with its dual-channel DDR SDRAM memory interface, I thought an elegant, innovative and revolutionary solution to the memory bandwidth problem had finally arrived. If implemented on a video card, a dual-channel 128-bit DDR SDRAM memory interface could provide the same bandwidth as a 256-bit DDR SDRAM memory interface! This would go a long way towards eliminating the infamous bandwidth bottleneck, so adding this feature to the new NV20 seemed obvious. I was genuinely surprised when I didn't see it listed on the spec sheet.

    Although it's not in the current specs for the NV20, how difficult do you think it would be to implement a dual-channel memory architecture (similar to that of the Crush chipset) on one of NVidia's 3D accelerators?
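The equivalence claimed above is straightforward to verify: peak DDR bandwidth is channels x (bus width in bytes) x clock x 2 transfers per clock, so two 128-bit channels match a single 256-bit interface at the same clock. A quick check (the 200 MHz clock is just an example figure, not a product spec):

```python
def ddr_peak_gb_s(channels, bus_width_bits, clock_mhz):
    """Peak DDR bandwidth in GB/s: width in bytes x clock x 2 transfers/clock."""
    return channels * (bus_width_bits / 8) * clock_mhz * 2 / 1000

dual_128 = ddr_peak_gb_s(2, 128, 200)    # dual-channel 128-bit, "Crush"-style
single_256 = ddr_peak_gb_s(1, 256, 200)  # hypothetical single 256-bit channel
print(dual_128, single_256)  # both 12.8 GB/s
```

The appeal of the dual-channel route is practical: two narrower buses are easier to lay out on a PCB than one very wide one, for the same peak figure.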

    By kflood January 19, 2001, 06:42 AM

    Compared to other video cards I've owned, NVidia cards have poor 2D clarity at high res/high refresh rates. I've owned a TNT2 Ultra, a GeForce and now a GeForce 2, but none of them equaled the 2D quality I got from cheaper ATI and Voodoo3 graphics cards. Does NVidia plan to do anything to improve 2D output? (Some of us have to do work on our systems as well as playing games.)

    Regards,

    Kevin.

    By eMpTy January 19, 2001, 06:55 AM

    Has the acquisition of 3dfx, with their massive experience with SLI and, more recently, the VSA-100 design, made a multiple-chip NVIDIA solution any more viable? Or does NVIDIA just not want to make a card with two chips?

    By jvadgaard January 19, 2001, 08:02 AM

    I've been reading through the posts here, and all questions (including the ones I didn't even know I wanted to ask!) I myself would like to ask have been posted. In my opinion, the most interesting questions are on the following topics:

    · What are nVidia's plans for improving texture compression on existing cards?

    · Considering that $400-500 looks out of reach for most customers, will anything be done to help reduce the price of new cards, or is it economically the best choice to keep prices this high?

    · In terms of future technologies, what measures will be taken to improve on the fill-rate bottleneck? Is dual-channel memory being considered?

    · What new technologies are being researched for future products?

    Vadgaard

    By joaco1728 January 19, 2001, 09:35 AM

    WILL NVIDIA INCORPORATE SLI INTO THEIR GEFORCE GRAPHICS CARDS? CAN YOU IMAGINE THE POTENTIAL THERE IS IN THAT? THE ACTUAL CHIP IN THE VOODOO 5 WAS NOT AT ALL IMPRESSIVE, BUT WITH SLI IN THE GEFORCE YOU COULD MAKE A DIFFERENCE, AND YOU COULD MAKE MOTION BLUR AND THE T-BUFFER REAL.
    IF YES, WHEN COULD WE SEE A CARD WITH IT?

    By dburrell January 19, 2001, 10:14 AM

    I have a two part question.

    When will you add hardware FSAA and Matrox-style bump mapping? These features have been requested in the past, but it seems NVidia and ATI have taken the position that the consumer won't miss them. I do miss them, and this affects the amount of money I am willing to spend on video card upgrades. Please note I have not upgraded my GeForce256 SDR.

    By Azuth January 19, 2001, 10:25 AM

    What steps is Nvidia taking to address memory bandwidth (like ATI's HyperZ)? Will we see any of these features in the NV20?

    By AMD_Forever January 19, 2001, 10:34 AM

    quote:Originally posted by joaco1728:
    WILL NVIDIA INCORPORATE SLI INTO THEIR GEFORCE GRAPHICS CARDS? CAN YOU IMAGINE THE POTENTIAL THERE IS IN THAT? THE ACTUAL CHIP IN THE VOODOO 5 WAS NOT AT ALL IMPRESSIVE, BUT WITH SLI IN THE GEFORCE YOU COULD MAKE A DIFFERENCE, AND YOU COULD MAKE MOTION BLUR AND THE T-BUFFER REAL.
    IF YES, WHEN COULD WE SEE A CARD WITH IT?


    someone ban this idiot



    Copyright © 1999, 2000 internet.com Corporation. All Rights Reserved.


    Ultimate Bulletin Board 5.46




