3dfx Financials & Demise?

By Steve Gibson, Nov 18, 1999 8:49pm PST Not a whole lot going down for the evening, but if you're extra bored you can check out the 3dfx financial report as well as this VE article talking about the supposed demise of 3dfx. It's more of that T&L chest beating stuff, which will certainly entertain the hardware buffs out there. My 2 cents: no single graphics card will dominate the market for 2 years anymore; the market is a different place now. To draw such radical conclusions from the Voodoo4/5 announcement just boggles my mind. Of course, generating a response was the intent of the article, eh? While we are on the 3dfx kick, Beyond3D has an explanation for just what the heck that Intel chip is people were wondering about in this picture of the Voodoo5 from Scott Sellers:

The purpose of the chip, according to Scott, 'is to isolate the AGP bus from the 4 chips on the V5 6000.  We need electrical isolation on the design, otherwise we would have too heavily loaded an AGP bus slot.  The Intel part serves as a bridge between the AGP bus and the 4 rendering engines...'

Oh, also Paul Steed sent me a few more videos for you guys showing himself doing motion capture and then how it was plugged into a Quake3 character. Really pretty darn interesting. Gotta wait for the FTP situation to stabilize on that one though, so we'll just shoot for tomorrow on that stuff.

update Jack Want my opinion on that article?  Too bad, you're getting it anyway :-)  For one, I love how the press is really taking the T&L issue and running with it.  Is it important?  Yes.  Is it necessary yet?  That remains to be seen.  Until the technologies converge, developers will still have to write for people without T&L for a while, and that's not going to change.  At the same time, developers can't cater to only a T-Buffer environment either.  And they simply don't have time to implement lots of either.

At the same time, 3dfx is going for some pretty high price points, but in the end that doesn't really matter either, as their apparently stunning success in retail has still left them losing money overall.  The real issues at 3dfx at the moment have nothing to do with technology or retail pricing strategy; they have to do with the internals of a company trying to find and maintain a profitable business model in an industry where it's no longer the darling.  If technology were indeed the limiting factor, ATI would not be winning this war handily.  If retail really mattered right now, then 3dfx would be out on top.

But then again, there is the point brought up in the article about the juggernaut that is Glaze3D.  And that, my loyal readers, was sarcasm.



68 Threads | 68 Comments

  • Well... I have V2 SLI and I don't use it for current-gen games (e.g. Q3A); I use my TNT2 Ultra. I only use the V2 SLI for "old" games, i.e. games based on technology that was current when Voodoo2 was new tech, such as Half-Life and even older racing games like POD and Motorhead... So I don't really see the V2 as being "viable" now... sure it works, but if you are looking for a good-to-great gaming experience, you aren't realistically going to be using a Voodoo2 SLI for current-tech games, or for the games of "tomorrow"

    If you had a V5 6000 now, I would guess it would be useful this time in 2000, and usable through to this time in 2001 - not to the degree that V2 SLI is now, but still serviceable

    And I know that T&L and fill rate are (mostly) separate things... I guess what I was trying to say was that for a supposed "next gen" card, the GeForce doesn't really give that much more fill rate over a TNT2 Ultra... I want to be able to play at greater than 1024x768... and the V5 cards are going to let me do that, more so than the GeForce will.

    Or to put it another way, if I smack all the detail settings right up in Q3, the T&L on the GeForce will give me far more detail at lower resolutions without overloading the CPU, and so without dropping the framerate too much.... but I still cannot play at high resolutions (And still have a very good FPS (60-80+)) because of the low fill rate of the GeForce.

    The V5 series will let me play at high res with good framerate, but without the detail.

    So what I'm really saying is that we need to see GPix+ fill rate with good T&L.. ;-)

  • Monk, the SLI was kick ass when it came out, and it's still adequate today, three years after its introduction to the market.

    Three years from now, would you be able to say the same about the V5 6000? Would you even be able to keep it a single year? Its lack of hardware transform and lighting is going to seriously hurt it in the long run.

    Even though you may be able to upgrade to a fast processor, no general purpose consumer-level CPU will be able to attain the speed of a processor specifically designed to do graphics. Not even a 2 GHz Athlon will have the same floating point power as the T&L unit on the current generation GeForce. The processing power is there, mountains of it. The problem is that current supporting hardware, most notably the AGP 2x bus, is not fast enough, and game software not well optimized enough, to take advantage of a T&L card's full potential. As the market moves to AGP 4x and games, such as Halo, begin to utilize T&L in an appreciable way, the T&L deficiency of Napalm-based cards will become more glaring.

    It is also important to note that fillrate has little to do with the ability to output a high number of polygons. Yes, a higher poly count might demand more fillrate because of overdraw, but a truly well written game would be able to minimize overdraw while using much of the potential of the GeForce. Nvidia's demos, some of which push around 4 million polygons per second, demonstrate this fact.

    At this point it might be helpful to explain what overdraw is and how it relates to fillrate.

    The number of pixels shown per frame at any resolution can be calculated very easily by multiplying the horizontal resolution by the vertical resolution. So the number of pixels shown on a 1024 x 768 screen is just 1024*768=786432 pixels. In an ideal situation with no overdraw, if you want to render 1024 x 768 at 60 frames per second, you'll have to have a fillrate of 1024*768*60=47185920 pixels per second, which is about 47.2 MPixels/s.

    This is where overdraw comes in. Because of the way 3d software is written, for every on-screen pixel, the 3d card running a 3d game usually has to render several unseen pixels. This is wasteful and inefficient, but also unavoidable. Overdraw is a measure of how many total pixels are rendered for each single pixel displayed.

    Q3 has approximately 3.5x overdraw. That means that for every pixel displayed, the video card has to render 3.5 pixels. So now we have to revise our fillrate requirement upwards. To do this, we simply multiply by 3.5: 47.2 MPixels * 3.5 = 165 MPixels. Now let's suppose that each pixel is painted with 2 textures. This translates to 2*165=330 MTexels.
    Compare this number to the GeForce's fillrate of 480 MTexels/s and 480 MPixels/s. It is important to note that no video card will be able to attain its claimed peak fillrate under real-world circumstances.
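    The arithmetic above can be sketched out quickly (Python used purely for illustration; the resolution, overdraw, and texture figures are the ones from this post):

```python
# Required fillrate for Q3-style rendering, using the figures quoted above.
width, height = 1024, 768
fps = 60
overdraw = 3.5             # approximate overdraw factor quoted for Quake 3
textures_per_pixel = 2

pixels_per_frame = width * height                      # 786,432 pixels
ideal_rate = pixels_per_frame * fps                    # 47,185,920 pixels/s
with_overdraw = ideal_rate * overdraw                  # overdraw-adjusted rate
texel_rate = with_overdraw * textures_per_pixel        # 2 textures per pixel

print(f"ideal:         {ideal_rate / 1e6:.1f} MPixels/s")   # 47.2
print(f"with overdraw: {with_overdraw / 1e6:.0f} MPixels/s") # 165
print(f"texel rate:    {texel_rate / 1e6:.0f} MTexels/s")    # 330
```

    The 330 MTexels/s result is what gets compared against the GeForce's claimed 480 MTexels/s peak.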

    Now, where does a higher number of polygons come into play in this picture? The answer is that a higher poly count might increase overdraw. If you have a bunch of models obscuring many things in your field of view, for example, you'll have higher overdraw than when you don't. But if you design your levels and write your engine to minimize overdraw, this shouldn't be as much of a problem. For example, you won't get nearly as much overdraw when you use T&L to render a bunch of realistic looking hills and mountains as when you try to render a forest with several hundred trees.

    Currently, the GeForce is not limited by fillrate except at resolutions higher than 1024*768. The greatest limiting factors on this card are on-board memory bandwidth, which should be alleviated by the introduction of DDR, and AGP bus bandwidth, which should be eased by AGP 4X. Both, however, will remain a problem, as will processor speed, which is the current limiting factor at lower resolutions. Remember, a CPU has to perform many functions besides geometry setup, so even if a PC's onboard video card has blazing fast T&L, the CPU will still be a bottleneck. The difference is that the bottleneck will show up at higher framerates.

    To truly see that the GeForce is not fillrate limited, you can go to www.fastgraphics.com. The crazy mofos there used a peltier cooler to overclock their GeForce to 165 MHz core speed, which translates to an astounding 660 MPixel/s, 660 MTexel/s fillrate! The resulting speed increase in Q3, however, was rather disappointing, the reason being that the SDR memory, the CPU and the AGP bus were simply not fast enough to keep up with the GeForce. It should, however, be interesting to see how well a DDR card would do under the same circumstances.
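    Where that 660 figure comes from: peak fillrate is just core clock times pixel pipelines (a hypothetical helper for illustration; the GeForce 256 has 4 pixel pipelines with one texture unit each, so its MPixels/s and MTexels/s figures come out equal):

```python
# Peak theoretical fillrate = core clock (MHz) x number of pixel pipelines.
# Real-world fillrate falls short of this, as noted above.
def peak_fillrate_mpixels(core_mhz, pipelines=4):
    return core_mhz * pipelines

print(peak_fillrate_mpixels(120))  # stock GeForce 256: 480 MPixels/s
print(peak_fillrate_mpixels(165))  # peltier-cooled:    660 MPixels/s
```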

  • I don't see why everyone is being so reactionary about the so-called high price:


    And look at the specs of V2 SLI vs V5 6000...

    Ok, with that out of the way, if you look at the prices of the other cards, they compete pretty well with GeForce and (kind of) with Rage Fury MAXX, so that really isn\'t an issue here IMHO... besides, if you got the green you KNOW you will take a good look at these things.. ;-)

    So what of T&L? Right now only nVidia have T&L silicon out there in the shops, and frankly the GeForce doesn't have the fill-rate to back up its T&L... and as sCary pointed out a while back, you can set Quake3 to use the maximum detail on any video card... if you have a massively powerful CPU, it will to an extent "make up" the difference (AMD K7 at 1 GHz anyone??) Considering the Voodoo 4/5 and 1 GHz AMD (and probably 1 GHz Intel) are all going to come out around late Q1 or early-to-mid Q2 2000, it makes for an interesting thought.

    T&L is definitely important, and it is not really as bad as the 16 vs 32 bit "scandal" in the sense that you can compensate for lack of T&L (to an extent) by having some serious CPU... Right now, all the games will still benefit from massive fill rate... T&L is not going to get you 1024x768 (or above) at 100+ FPS... and personally I can live with playing Q3 at say 1280x960 with current poly counts, vs Q3 at 1024x768 with double/triple current poly counts... for the moment.

    What I'm hoping 3dfx have up their sleeve is a T&L engine for around Q4 2000 / Q1 2001, with their usual fill rate increase. That's kinda what they need right now... for my money, the fill rate will buy them time until then... but if T&L takes off as many seem to think it will, 3dfx will NEED T&L by that time (end 2000).

  • Recycling a product makes sense: upgrading manufacturing processes, memory types, speeds, etc. to increase performance for INTERIM products makes sense.

    It is the degree to which you recycle that can put you in the proverbial crosshairs. I was in fact a long-time supporter of 3Dfx, and bought a V1 when they were $299. Recently I got a TNT2 Ultra. Why? Well, after nearly 4 years, 3Dfx was (and will be) using the same architecture, which really is dated.

    It wasn't a year ago that they were saying developers didn't really want 32-bit color, or better resolution depth buffers, or larger texture support, etc. Now all of a sudden, developers do? But now none of them need T&L... but in a year? Or whenever they release a product that has T&L, you'll need it!

    Sorry, you can't just tack on more of the same old crappy chips, which still have a limiting architecture, and call yourself the best. 3Dfx took the early lead, but failed to innovate and push forward, and now they wallow in their ineptness.

    Who here thinks that Nvidia or any other manufacturer couldn't just rework their designs to let them tack chips together and come up with a card that is just as fast for a similar price (likely with more features)?

  • Gee, people in love with themselves...just like me :D

    #52, I wub j00 2.

    I do remember nvidia's Derek Perez saying in an interview that the GeForce is a new design from the ground up. The reason why it works with TNT2 drivers is that it retains the same interface as the TNT.

    Though the GeForce looks to be the most compelling 4th generation video card, in actuality I'm not satisfied with the performance of any of the cards coming out, so I'm gonna wait a few more months, at which time I'll probably also upgrade my mobo, processor, memory and hard drive.

    #52, I made some conjectures and pulled a few facts that sounded reasonable enough that you felt compelled to call me brainwashed. Are you not smart enough to come up with your own speculations instead of insulting others? Oh, wait, that's not really fair. Cuz the GeForce and others are here already :D Guess you'll just have to keep on spewing personal insults then, which are kinda amusing in any case :)

  • Um, #49... yes, the technology is a derivative of the Voodoo Graphics core, much the same way that the GeForce256's chip architecture is based on the TNT2 architecture. It's so sad that you Nvidia fanboys are so brainwashed, you don't even understand what you are talking about anymore.

    Same thing goes for some people's comments regarding putting chips in SLI mode.
    It is a TECHNIQUE to increase fill-rates by putting chips in parallel. Parallel processing has always been a viable method of getting more power out of computer hardware, from SGI workstations to the rise of dual-CPU systems.

    We all remember how you Nvidia fanboys failed to admit that the GeForce256 is BASICALLY just coupling TWO TNT2 raster engines and combining that with a hardware geometry engine. What does that mean to gamers? It demonstrates that hardware is not revolutionary at every stage; it is "evolutionary".
    You hypocritical company-loyal dumbasses need to check your brains for a moment.

    And yet another example of your company-loyal hypocritical stupidity: a few months ago, you were telling the world that the TNT2 was superior since gamers demanded high-resolution gaming with 32-bit color... but with the release of the GeForce and its hardware T&L capabilities, you have now gone against your former beliefs and are stating that gamers demand frame-rate and nothing else (which is what you criticized 3dfx for doing months ago).

    All you Nvidia fanboys, and other company-loyal dumbasses need to grow a brain.

  • And straight from www.3dfxgamers.com:

    3dfx: Is the VSA-100 Engine, the chip that powers the Voodoo4 and Voodoo5, still based on the original Voodoo Graphics core, or is it an all new design?

    S: It's somewhere in between. The VSA-100 is still based on the Voodoo Graphics core, for compatibility reasons, but we've made significant changes to the overall design. We had to build in support for 32-bit color, 2 pixels per clock rendering, new texture modes, new combine modes, tremendous scalability improvements, support for texture compression, larger texture sizes and a number of other new features. At heart, it's the Voodoo core, with lots of substantial changes. As a result, we consider it to be really close to a new architecture.


  • Motion blur does not look like the mouse-trail effect many people think of. It only appears that way in a screenshot... the same way that taking a photograph of a moving object reveals a blurred image, while watching it in real life you see smooth movement.

    The motion blur in the V5 will NOT appear as mouse-trail effects, but rather will serve to make choppy animation (which runs at around 30 animations per second) smoother, and will make detailed structures and models easier to follow at long distances. While the motion appears smooth and fluid, taking a real-time freeze at any point will reveal the actual effect being applied during each frame.