PlanetCrap 6.0!
T O P I C
Walk Into Wall. Bleed. Repeat.
September 4th 2000, 04:09 CEST by andy

NVIDIA owners -- rejoice! Epic have learned from their past mistakes.

Owners of other video cards -- sigh in disbelief. Epic are going to make the same mistake again...



Epic's Unreal engine was originally designed with 3DFX cards in mind, with its hardware rendering being based around the Glide API. This was because 3DFX cards were expected to be the de facto standard for gaming. It didn't work out that way, though, and as many other 3D cards became popular, so Epic had to adjust their engine to work with Direct3D and OpenGL. Although things got better with Unreal Tournament, and have continued to improve with patching, people still complain of poor (by comparison) frame rates under Direct3D, especially on NVIDIA cards.

So do you think Epic have learned by now that it's a bad idea to target specific cards? (Cue the 'Crap masses: No!)

Believe it or not -- but you really should believe it because it's true -- Epic have decided to focus their next engine on a specific manufacturer's cards. Again. This time it's NVIDIA cards, as announced in this press release:

NVIDIA Corporation and Epic Games Inc. today announced a strategic partnership aimed at producing next generation applications that take full advantage of NVIDIA's latest 3D technology, including the new features of the GeForce 256(TM) and GeForce2 graphics processing units (GPUs).

"Epic's latest game, Unreal Tournament, is one of the most popular games in the world," says Sanford Russell, senior director of partner management at NVIDIA. "We see this as an opportunity to work more closely with Epic Games to ensure current and future Unreal Engine products run their absolute best on NVIDIA hardware."

[...]

"NVIDIA's dominant position in the PC market and their selection as the graphics platform provider for Xbox, clearly make them the most important graphics vendor from a business standpoint. But even more importantly, they're first in bringing to market the kind of advanced technology we intend to build our future products around, and that's what really drives us from a design standpoint," says Tim Sweeney, founder and lead programmer for Epic Games Inc.

Amazing, isn't it? Epic put themselves through a world of pain by focussing on one manufacturer when they believed it was going to dominate the market, and now four years later they're doing the same thing for the same reason. Gee, I wonder if it will all go horribly wrong...

C O M M E N T S
#1 by "Show Timw"
2000-09-04 04:12:43
bmw@carolina.rr.com
Let's get this over with.
#2 by "None-1a"
2000-09-04 04:13:48
none1a@home.com http://www.geocities.com/none-1a/
Somehow I see this as nothing more than supporting Nvidia's hardware T&L and the per-pixel shaders, not really an attempt to get it running its best on Nvidia hardware.

--
None-1a.

O forget it.
#3 by "Show Time"
2000-09-04 04:15:26
bmw@carolina.rr.com
Now let's fix my name.
#4 by "Baytor"
2000-09-04 04:26:11
baytor@yahoo.com http://www.geocities.com/baytor
<quote>
Believe it or not -- but you really should believe it because it's true -- Epic have decided to focus their next engine on a specific manufacturer's cards. Again. This time it's NVIDIA cards, as announced in this press release:
<quote>
NVIDIA Corporation and Epic Games Inc. today announced a strategic partnership aimed at producing next generation applications that take full advantage of NVIDIA's latest 3D technology, including the new features of the GeForce 256(TM) and GeForce2 graphics processing units (GPUs).
</quote></quote>
Maybe I'm just being difficult, but are you sure this means that they're going to focus on the NVIDIA technology to the detriment of other cards?  Just because they're going to make sure that NVIDIA works as well as possible on their new engine doesn't mean they're going to screw themselves by not having Glide and/or D3D support.

I'll be curious to see what happens when the folks from EPIC make their inevitable appearance.

I... AM BAYTOR!!!!
#5 by "Chris Johnson"
2000-09-04 04:29:52
If anything, I think this will help silence the people who keep trying to claim Epic is in the pocket of 3dfx.  Maybe not the best way to silence them to be sure, but... *shrug*

On another note, it's definitely cool to be setting things up to use advanced geometry processing tech and all, but how painful/feasible will it be for licensees to incorporate these changes into their projects... especially those that have been in development for a while? Ford knows, there are definitely quite a few out there. In all honesty, I'm hoping it will be relatively easy and available to all of the licensees out there, but code merges are, I'm sure, more and more evil the farther a product is into development.

(The preceding is not intended as a slam in any way at Epic or the licensees, especially since I worked for one... I'm actually quite interested in this, seeing as I'm looking forward to a few of the licensees' games. Just so you know.)
#6 by "Jeremy"
2000-09-04 04:32:10
jnthornh@eos.ncsu.edu
I'm wondering how much should really be read into that...

My personal take is that it's more to placate Nvidia users than to actually define a course where Epic focuses primarily on Nvidia cards.

Considering that Nvidia has been the Unreal underachiever in the past, this just means that they plan to give some extra attention to the cards to get performance up to speed.

Who knows but Epic at this point... but as Andy said, focusing only on a few cards was obviously a huge mistake in the past.  I personally think (hope?) that Epic has learned from it, and so am not really taking that press release to mean doom and gloom for 3dfx or other card owners.

Jeremy
--
Despite your efforts to be a romantic hero, you will gradually evolve into a postmodern plot device.
#7 by "Andy"
2000-09-04 04:33:09
andy@planetcrap.com http://www.meejahor.com/
<b>#4</b>, Baytor:
<QUOTE>
Maybe I'm just being difficult, but are you sure this means that they're going to focus on the NVIDIA technology to the detriment of other cards?
</QUOTE>
Yes and no.

Yes, I think they're going to make sure that NVIDIA cards get the absolute best performance. (It says as much in the press release, so either we have to say NVIDIA/Epic are lying, or we have to accept it.)

But no, I don't think they're going to deliberately worsen performance on other cards, it's just going to be unavoidable. (Otherwise, what would be the point of the agreement between Epic and NVIDIA? If all cards are going to perform to their maximum potential, this agreement would mean absolutely nothing.)
#8 by "[@~]MizuGami"
2000-09-04 04:38:40
mizugami@rochester.rr.com http://www.komatose.com
While I agree that designing a game around a specific video chip is not necessarily the best decision, I am glad that Epic has decided to go with the VASTLY superior Nvidia cards over the 3DFX cards.

I have to disagree though, Andy, I don't think Epic necessarily "put themselves through a world of pain" by focusing on the 3DFX platform. The patches have made UT much better, and I am able to run UT in OpenGL mode much better than when it ran in D3D. The only other real contenders out in the market right now (besides Nvidia) are the Radeon and 3DFX. Everyone knows 3DFX is the baldheaded stepchild of the group, and the Radeon hasn't been tested enough to give solid conclusions.

Either way, developers will make their games the way they want to. If we want to play them badly enough, we will either suffer through D3D or buy new hardware to compensate.

[@~]MizuGami
#9 by "Andy"
2000-09-04 04:39:11
andy@planetcrap.com http://www.meejahor.com/
BTW, we've got another Epic topic coming up soon. So if the Epic boys turn up here and start flaming me, and then another Epic topic pops up tomorrow, that's just the way it goes, it's not revenge. Not that the next topic is in any way critical of Epic, but just so you know.
#10 by "Apache"
2000-09-04 04:41:12
apache@voodooextreme.com http://www.voodooextreme.com
nvidia does not have a proprietary API like glide, just D3D or OpenGL, which is supported by every 3D card. this is just pr crapola.
#11 by "Baytor"
2000-09-04 04:42:29
baytor@yahoo.com http://www.geocities.com/baytor
<b>#7</b> The Pagan God of Lust, "Andy" wrote:  
<QUOTE>But no, I don't think they're going to deliberately worsen performance on other cards, it's just going to be unavoidable. (Otherwise, what would be the point of the agreement between Epic and NVIDIA? If all cards are going to perform to their maximum potential, this agreement would mean absolutely nothing.)
</QUOTE>

As I see it engine developers have three setups they need to optimize their games for.

1)  3Dfx cards:  EPIC has already optimized one engine for these cards, so I'm sure they'll continue to make sure their Voodoo support is top-notch.  I don't think Voodoo card owners have much to worry about.

2)  NVIDIA:  Which they're going to be working on for this engine, and I have faith they'll nail it like they did the 3Dfx support.

3)  D3D:  The catch-all for other chipsets.  I'm sure they'll be working on it, but I don't think they'll be putting a monolithic focus on this.  How good it'll be is anyone's guess.

So, I would expect the new engine to work extremely well on both 3Dfx and NVIDIA chipsets, with D3D support being a wildcard.

And as soon as the EPIC folk circle the wagons, I'm sure you'll hear exactly this from them, although they'll probably talk up the D3D support :)

I... AM BAYTOR!!!!
#12 by "Baytor"
2000-09-04 04:46:35
baytor@yahoo.com http://www.geocities.com/baytor
<b>#7</b> The Pagan God of Lust, "Andy" wrote:  
<QUOTE>Yes, I think they're going to make sure that NVIDIA cards get the absolute best performance. (It says as much in the press release, so either we have to say NVIDIA/Epic are lying, or we have to accept it.)
</QUOTE>

Ummm, they're saying that it's going to make sure the new engine takes full advantage of the technology of NVIDIA's products.  That doesn't mean they won't take full advantage of Voodoo's products, although it is possible that there might be conflicts between optimizations for the two products, and in such a case I'm sure NVIDIA will win out because of the potential size of the X-Box market, but I doubt Voodoo owners will be as pissed off as GeForce owners are now.

I... AM BAYTOR!!!!
#13 by "Matthias Worch"
2000-09-04 04:48:03
mworch@legendent.com http://www.langsuyar.com
Unreal accesses NVidia cards via D3D. There are no manufacturer-specific settings in D3D, and there's no way to program video card specific effects into D3D (3DFX and Glide are different, since Glide is a proprietary API). Glide vs D3D = HUGE difference, and one that shouldn't be ignored (as Andy does so elegantly in his topic). Of course the press release is written in a "Yay, NVidia users rejoice" tone, it's the only way to convince the masses that the Unreal/NVidia problems are a thing of the past. In the end it doesn't mean anything, though - NVidia knows that other D3D cards will automatically profit from this deal, but that's something they can accept, since they still own over 50% of the market and get all the buzz.

This deal will help get the D3D performance of Unreal as good as it gets, and ALL D3D cards will profit from it (as long as they've got a T&L unit, that is). Making Unreal better for NVidia cards means that it will automatically become better for all other D3D cards. Everybody should be very happy about this, but I'm sure some people will find enough hair-splitting techniques to make this a 300+ post thread ;)

#14 by "Chris Johnson"
2000-09-04 04:49:52
Apache (10):
<quote>nvidia does not have a proprietary API like glide, just D3D or OpenGL, which is supported by every 3D card. this is just pr crapola. </quote>

Except for the nVidia-specific things like their T&L engine, their version of FSAA, etc etc etc.  So while the API itself may not be proprietary, there are definitely some things that can be coded specifically for nVidia chipsets/proprietary technology.
#15 by "Dethstryk"
2000-09-04 04:50:26
dethstryk@damagegaming.com http://www.damagegaming.com/
This is *good.* The end. (I also am an NVIDIA freak, which means all of my video cards have a chip by those suckers.)


--
Dethstryk
Damage Gaming
#16 by "Christopher Tew"
2000-09-04 04:53:49
kingmob@intermind.net
Of course, this probably has more to do with the X-Box than anything else.  ^_^
#17 by "Chris Johnson"
2000-09-04 04:54:21
Matthias:

Are the T&L unit, etc, a specifically nVidia thing?  Even if increased D3D support does help other cards/chipsets as well, wouldn't the code for this sort of thing still be more of an advantage to nVidia?

Just wondering, because I always understood that to be their own thing, not a generic setup that anyone could theoretically use.
#18 by "Matthias Worch"
2000-09-04 04:57:38
mworch@legendent.com http://www.langsuyar.com
#14:

<quote>Except for the nVidia-specific things like their T&L engine, their version of FSAA, etc etc etc. So while the API itself may not be proprietary, there are definitely some things that can be coded specifically for nVidia chipsets/proprietary technology.</quote>

Not really, the only (valid) way to feed the data to the T&L chip is through regular D3D commands, anyway, and every other card with a T&L unit will be able to "intercept" and use them just as well.
And FSAA is done at the driver/chip level anyway; there isn't anything the game can or has to do to make it work (which is why the V5 FSAA automatically works with every game).
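
(For the curious, "regular D3D commands" boils down to something like the DX7-style sketch below. This is purely illustrative code, not anything out of Unreal: the app hands untransformed vertices, matrices and lights to the API, and a T&L-capable driver does the work in hardware while everything else falls back to the software pipeline.)

#include <windows.h>
#include <d3d.h>   // DirectX 7 headers

// Purely illustrative, not Unreal code. 'device' is assumed to be an
// already-created IDirect3DDevice7*. Nothing here is NVIDIA-specific --
// whether transform & lighting runs on the chip is the driver's business.
void DrawLitMesh(IDirect3DDevice7* device, D3DMATRIX* world,
                 D3DVERTEX* verts, DWORD numVerts)
{
    // Hand the world transform and a light to the API instead of doing
    // the math on the CPU; a T&L chip (GeForce, Radeon, ...) picks this
    // up in hardware, other cards fall back to software.
    device->SetTransform(D3DTRANSFORMSTATE_WORLD, world);

    D3DLIGHT7 light;
    ZeroMemory(&light, sizeof(light));
    light.dltType = D3DLIGHT_DIRECTIONAL;
    light.dcvDiffuse.r = light.dcvDiffuse.g = light.dcvDiffuse.b = 1.0f;
    light.dvDirection.x = 0.0f;
    light.dvDirection.y = -1.0f;
    light.dvDirection.z = 0.0f;
    device->SetLight(0, &light);
    device->LightEnable(0, TRUE);
    device->SetRenderState(D3DRENDERSTATE_LIGHTING, TRUE);

    // Untransformed, unlit vertices (D3DFVF_VERTEX) -- the whole point.
    device->DrawPrimitive(D3DPT_TRIANGLELIST, D3DFVF_VERTEX,
                          verts, numVerts, 0);
}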

#19 by "Warren Marshall"
2000-09-04 04:59:25
warren@epicgames.com http://www.epicgames.com
<b>Andy</b> (#9):
<QUOTE>BTW, we've got another Epic topic coming up soon. So if the Epic boys turn up here and start flaming me, and then another Epic topic pops up tomorrow, that's just the way it goes, it's not revenge. Not that the next topic is in any way critical of Epic, but just so you know.
</QUOTE>

Heh, cool.  :)

<b>Matthias Worch</b> (#13):
<QUOTE>This deal will help get the D3D performance of Unreal as good as it gets, and ALL D3D cards will profit from it (as long as they've got a T&L unit, that is). Making Unreal better for NVidia cards means that it will automatically become better for all other D3D cards. Everybody should be very happy about this, but I'm sure some people will find enough hair-splitting techniques to make this a 300+ post thread ;) </QUOTE>

Huzzah!!

--

Warren Marshall - Professional Nuisance
#20 by "Lumberjack"
2000-09-04 05:00:31
joek@pckconsult.com
Methinks andy now has a new whipping doll.....Epic.


----------------------
I used to be conceited, but now I am perfect.
#21 by "Matthias Worch"
2000-09-04 05:01:17
mworch@legendent.com http://www.langsuyar.com
Chris: Nah, the Radeon already has a T&L unit, and the next Voodoo cards are supposed to get one, as well. Of course the capabilities of each chip might differ, but that should just result in increased/decreased performance. So for cards with half-assed T&L implementations you might have to turn down a few details. Pretty much like having a slow CPU.

#22 by "Lumberjack"
2000-09-04 05:07:26
joek@pckconsult.com
I'm just wondering if this means that nVidia will actually implement functional palettized texture support for the GeForce line.  If they do, and Unreal engine performance gets reliably good because of it, I just may ditch my V5 5500.


----------------------
I used to be conceited, but now I am perfect.
#23 by "Mad_Dog"
2000-09-04 05:29:51
markyork@cox-internet.com
Hmmmff. This is a moot topic.

Let's do some time travel, 'kay?

Way back when, when Unreal was first being written, there were no 3D accelerator cards. Then, there WAS a 3D card, or rather a chipset: 3dfx. Have you forgotten? Nothing could touch the performance of a Voodoo back then. So Unreal was updated/recoded to take advantage of what was, at the time, the supreme 3D card. The way you did it back then was to write for Glide, since DirectX/Direct3D was very flaky. At the time, coding your entire engine for Glide seemed to be a good idea, since there wasn't any other viable API around.

So we arrive back at the present day. Epic has an engine that is still very Glide-centric. And 3dfx is borderline on the ash heap of computer history. This needs to be fixed. Now, nVidia is the current 3D card of choice, but you know what? It HAS NO API. No hardcore, to-the-metal API like Glide. So...

As Apache stated in <B>#10</B>, this whole thing is moot. It doesn't matter if Epic is coding their next-gen engine to do D3D or OGL. Both are set standards. D3D works on any Windows box, OGL works on any box with supported drivers from hardware vendors.

Bah. This has taken me 20 minutes to type. Having a broken collar bone is a pain in the ass.

Mark/Mad_Dog
#24 by "Bracket"
2000-09-04 05:31:32
thebracket@yahoo.com http://borealis.eyep.net/
From reading what Epic have been saying over recent months, I'd be inclined to say that they will be supporting Nvidia cards by default more than in any specific manner - and that this agreement is little more than mutual advertising bunk.

My reasoning for thinking this is that Epic have said a lot about focussing all future engine development on Direct3D, and Tim Sweeney has repeatedly commented upon using DirectX 8 and 9's features. Since DirectGraphics (the Direct3D/DirectDraw merger in DirectX 8) is pretty much a hardware implementation of the GeForce 2 (and later NVx hardware), it's no surprise that their future projects work best on Nvidia's chips. ATI and other manufacturers have some wonderful hardware - but it's not quite as close to the DX8 specification, and therefore probably won't work as well.
#25 by "Greg"
2000-09-04 05:32:39
I would not read into this very much. It is not a dig at 3dfx or ATI. It is really just a PR statement that nVidia will pay full attention to Epic when Tim Sweeney calls them up saying he wants to test UT2/Next Big Game with nVidia's line of cards. I know that 3dfx, ATI, and Matrox have similar deals with developers, possibly even Epic. It is called developer relations, plain and simple. I've dealt with the ATI and Matrox DR in a small way (not for 3D, but for their Video in a Window cards). Aside from some marketing noise, I don't understand why this is a big deal.

Greg
#26 by "Paul"
2000-09-04 05:46:06
paul@paulbullman.com http://www.paulbullman.com
I'm not too sure we can draw any conclusions from this. I tend to agree with Apache, who said it's "PR CRAPOLA".

It is probably prudent to wait it out and see how things develop. Until we see the code in action, speculating is a bit counterproductive.

Until then, the Bills are up 16 - 13 with not much time left.. can the titans do it again?

Paul
Shrinkweb.com
#27 by "Morn"
2000-09-04 05:54:28
morn@planetcrap.com http://www.planetcrap.com
Gnargh.

As long as Epic don't use a new proprietary API by nVidia, I don't see much of an issue here.

- Morn
#28 by "Jafd"
2000-09-04 05:56:15
jafd@zombieworld.com http://jafd.isfuckingbrilliant.com
Hey! Great news! Especially since my video cards don't have any problem running D3D. Unlike some... other... cards.

It's gonna take more than some advertising gimmick to get me to drop my 2xVoodoo^2s :)
#29 by "Ilich"
2000-09-04 06:10:35
While all video cards nowadays support D3D and OpenGL, there is a world of difference in the way you would optimize for certain hardware.  In the case of the GeForce series and the V5, code which will speed up performance on one card will have no effect on the other, or perhaps even slow it down.  The fact is that right now you would have to write two different rendering cores to get the best speed out of both chipsets.
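
(In practice "two rendering cores" usually means two implementations behind one abstract interface, chosen once at startup. A rough sketch with made-up class names -- not Epic's actual code, just the shape of the thing:)

#include <string>

// Purely illustrative -- hypothetical class names, not Epic's interfaces.
// The point: each chipset-tuned core hides behind the same abstract API.
struct Scene;   // whatever the engine hands the renderer each frame

class RenderDevice {
public:
    virtual ~RenderDevice() {}
    virtual void DrawFrame(const Scene& scene) = 0;
};

class D3DRenderDevice : public RenderDevice {
public:
    virtual void DrawFrame(const Scene&) { /* GeForce/D3D-tuned path */ }
};

class GlideRenderDevice : public RenderDevice {
public:
    virtual void DrawFrame(const Scene&) { /* Voodoo/Glide-tuned path */ }
};

// Chosen once, e.g. from an .ini setting or hardware detection; the rest
// of the engine never cares which core it got.
RenderDevice* CreateRenderDevice(const std::string& api)
{
    if (api == "Glide")
        return new GlideRenderDevice();
    return new D3DRenderDevice();
}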

So unless we see a similar announcement from 3Dfx & Epic, we can pretty much count on them releasing a basic engine that will work on any hardware, and a very optimized one for nVidia's chips.  And given that MS said Sweeney would be involved in DX8 development, I think it's pretty clear what Epic will be spending their time on: X-Box.
#30 by "None-1a"
2000-09-04 08:00:16
none1a@home.com http://www.geocities.com/none-1a/
<b>#17</b> "Chris Johnson" wrote...
<QUOTE>Are the T&L unit, etc, a specifically nVidia thing? Even if increased D3D support does help other cards/chipsets as well, wouldn't the code for thsi sort of thing still be more of an advantage to nVidia? </QUOTE>

Yes and No. Nvidia was the first out the door with a hardware T&L engine for consumer-level cards, and since it was the first and only one available at the time, Microsoft basically had to use it as the base for DirectX hardware T&L. There are some changes here and there, but at its core it's Nvidia's setup.

On a side note, ATI would have more reason to get excited about a pure D3D T&L out of UT. The Radeon is built around some of DirectX 8 (the T&L engine, mostly).



--
None-1a.

O forget it.
#31 by ""
2000-09-04 08:14:04
<i>Thinking...</i>
#32 by ""
2000-09-04 08:14:19
<i>Thinking...</i>
#33 by "George Broussard"
2000-09-04 08:51:09
georgeb@3drealms.com
Epic's not doing one thing wrong here, Andy.

D3D is really the industry standard.  Glide is dying and really dead now.  Most people will not support OpenGL because it has driver issues across cards, and requires the card people to commit to an OGL driver.

That leaves D3D.  You may not like it, but Microsoft isn't going anywhere and they own the driver market now.  Period.

D3D is the future of PC games. Get used to it.

All Epic is doing (I'm sure) is making sure their new stuff runs on nVidia cards with all the bells and whistles.  If Epic isn't using D3D then that's another issue, but I suspect they are.

George Broussard, 3D Realms
#34 by "Valeyard"
2000-09-04 09:09:11
valeyard@ck3.net http://www.ck3.net
This is the change I've been arguing should take place for a LONG time.

This is NOTHING but good.  The problem with targeting 3DFX is that they use a <b>proprietary</b> API.  Designing for that API automatically limits your target audience and makes it complicated to "reverse engineer" the core rendering engine to work with other APIs... thus the problems nVidia owners are currently seeing with UT-based games in D3D.

By targeting nVidia, they're not latching on to a proprietary API.  They'll have full OpenGL and D3D support, and it's a relatively simple matter to design for those APIs and <b>still</b> include support for the hardware-specific features on systems that support it.

In my opinion, four years ago, Epic made a logical choice (though arguably not the best choice).  At the time, it looked like a good idea, as 3DFX was poised to take over the 3D-chipset world.  3DFX dropped the ball.  Now Epic is in a position to switch their focus, and I believe they're making the right decision.

3DFX probably won't roll over and die, but they need to do some restructuring to ensure that their hardware and drivers fully support D3D and OpenGL at a competitive performance level.

IMO, GLIDE has had its day.  We've moved beyond what can be accomplished with a proprietary subset of OpenGL features; today's games (and tomorrow's) will require much more than the 3DFX/GLIDE combination can provide.

-Valeyard
#35 by "Valeyard"
2000-09-04 09:25:16
valeyard@ck3.net http://www.ck3.net
<b>#Main Post</b> "andy" wrote...
<QUOTE>Gee, I wonder if it will all go horribly wrong...</QUOTE>

I don't see how.  When you weigh the available hardware performance, nVidia comes out on top.  When you weigh the APIs, D3D comes out on top.

Unless another vendor comes out with a radically new hardware design <b>AND</b> it requires a proprietary API <b>AND</b> it can garner support from developers and consumers....the situation isn't going to change for a while.

When you consider game development time of 18-36+ months, the decisions made today will affect the industry for at least that long.  That doesn't imply that the industry will stagnate...the key is scalability.

Even if someone came up with a 3D chipset that pushed twice as many polys/pixels as a GeForce2 Ultra, it'd <b>have</b> to support the open-standard APIs <i>just</i> to establish a competitive edge.  Developers simply aren't going to risk jumping ship to a new, untested product with a small user-base.

Considering that, if this wonder-chip popped into existence tomorrow and DOES support the open-standard APIs... we've lost nothing.  With proper LOD support, the games can be made to scale up to the new hardware or remain playable on the average user's system.

This is a win-win situation for Epic and Gamers.  It may also give 3DFX the wake up call they need to leap back into competition.

Open-standards are good, scalability is good.  Mmm'kay? :)

-Valeyard
#36 by "Andy"
2000-09-04 09:35:51
andy@planetcrap.com http://www.meejahor.com/
Okay, either you've all misunderstood, or I've misunderstood, or God help me I've been suckered big-time by a press release.

I took the press release to mean that Epic will be designing their games to work best on NVIDIA cards. I don't mean that Epic games would be designed for some new, as-yet-unknown NVIDIA API, but that features would be implemented that would look/work great on NVIDIA cards, but not necessarily work so well (if at all) on other cards.

Stupid example, just to illustrate what I mean: NVIDIA cards can render photo-realistic skies using one D3D command. Epic uses these skies in their next engine, and people with other cards get some dodgy mid-90's DOOM sky.
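
(The standard guard against exactly that scenario is a capability check: ask the device what it can actually do, and pick a fallback path when a feature isn't there. A rough DX7-style sketch, with cube-mapped skies standing in for any GeForce-class extra -- illustrative only, not Epic's code:)

#include <windows.h>
#include <d3d.h>   // DirectX 7

// Illustration only: pick a sky path from what the device reports,
// instead of assuming GeForce-class features are present.
enum SkyPath { SKY_CUBEMAP, SKY_FLAT_POLY };

SkyPath ChooseSkyPath(IDirect3DDevice7* device)
{
    D3DDEVICEDESC7 caps;
    ZeroMemory(&caps, sizeof(caps));
    device->GetCaps(&caps);

    // Cube environment maps are new in DX7; plenty of cards lack them.
    if (caps.dpcTriCaps.dwTextureCaps & D3DPTEXTURECAPS_CUBEMAP)
        return SKY_CUBEMAP;     // the fancy sky

    return SKY_FLAT_POLY;       // the "dodgy mid-90's DOOM sky" fallback
}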

So yeah... me, suckered by a press release. I bow my head in shame.
#37 by "Rambar"
2000-09-04 10:52:52
<QUOTE>So yeah... me, suckered by a press release. I bow my head in shame.
</QUOTE>

Well, it's that or you could just assume Epic and/or Nvidia is full of shit and that what this really means is that D3D will work like it should, but with a few specific and meaningless Nvidia graphical features.  Ah, damned if you do and damned if you don't, Andy.
--
Rambar
#38 by "George Broussard"
2000-09-04 11:14:45
georgeb@3drealms.com
Andy,

<QUOTE>I took the press release to mean that Epic will be designing their games to work best on NVIDIA cards. I don't mean that Epic games would be designed for some new, as-yet-unknown NVIDIA API, but that features would be implemented that would look/work great on NVIDIA cards, but not necessarily work so well (if at all) on other cards.</QUOTE>

What other chipsets are worth a crap besides nVidia and 3Dfx?  Sorry, they all pretty much suck outside of those two.

The point is to do things in one API and let it use whatever features the cards can.  Sure, nVidia cards will likely look the best for now, and if you have some cheap ass card it's going to be less acceptable.

But how is that a bad thing?  The days are gone of trying to hand tune a game to look good on several cards at once.  The hardware people better get in line with features if they want games to look good on their cards.

There will always be a couple of APIs out there, but anyone that thinks D3D isn't going to be the primary one is mistaken.

George Broussard, 3D Realms
#39 by "Warmonger [AI]"
2000-09-04 11:30:21
warmonger87@hotmail.com
I've said it before and I'll say it again:
Quake 3 engine for OpenGL
Lithtech engine for D3D
Unreal engine for Glide

So, just code the entire game in each of these engines, and you're set!

#40 by "xero"
2000-09-04 11:56:06
xero@tweak3d.net http://www.tweak3d.net
Hey Warren? I just thought I'd let you know that the Linux port of UT is fucking <b>awesome</b>. For those of you who didn't catch it the first time, I said <b>awesome</b>.

S3TC support, an OGL renderer that runs faster than the Windows D3D, beautiful smooth stutter-free playing (hello Linux memory management ;), faster load times, volumetric lighting that looks ass kicking, blah blah blah... you guys did the right thing hiring Loki.

I could rant for a while on this one... When a friend of mine saw it, he said "Holy shit, it's like Quake3 now, only it includes fun."... UT's back on my harddrive, on its own 2.5GB Linux partition (yeah, I installed it just to see if it was better).

Thanks for the support Epic. :)


As for you Andy, I think it was just a simple misinterpretation on your part... Epic may very well support features only NVIDIA can do, <B>at the time</b>... but I doubt they'd support proprietary technology available and applicable only to NVIDIA.

Another thing to consider is: These days if a game can run efficiently on an NVIDIA card using D3D or OGL, it should run well on other cards as well as long as their drivers are in good shape, right? NVIDIA so far has avoided proprietary technology and proprietary APIs, so if a game is optimized for a card with fairly standardized features, it should do well for most cards.
#41 by "xero"
2000-09-04 12:06:31
xero@tweak3d.net http://www.tweak3d.net
<b>George Broussard</b> (#33) wrote:
<quote>D3D is the future of PC games. Get used to it.</quote>

I beg to differ. Unfortunately, OpenGL won't die, because it's multiplatform and not OS-specific. D3D is going to eat some market share, but there will still be a large section using OpenGL because it's what's supported under Linux, Unix, BeOS, and Mac.

Honestly, the only major thing I see pulling for DX here is the fact that D3D will be integral to the design of the Xbox. With titles being easily portable between PC and Xbox if D3D is used, that's obviously going to sway a huge amount of developers over.

But unless and until you see MS produce DX/D3D for other platforms (which is doubtful... but hey, they <b>are</b> porting Office to Linux :), you're not going to see D3D take over.
#42 by "Flamethrower"
2000-09-04 14:24:19
blah http://blah
Carmack is only interested in Doom 3 on the X-Box if the X-Box has an OGL implementation. So it will have it.

Andy's italics-source in the other thread (regarding 'Molehill Erectus') is spot-on. From now on all George Broussard has to do is say <b>Meep Meep</b> whenever he wants to reduce Andy's argument to rubble.
#43 by "godZero"
2000-09-04 14:39:44
godZero@gmx.de
Even without a proprietary API, a programmer can use hardware-specific functions of nVidia cards directly (bypassing D3D). Call it a driver hack, if you want.

As far as I know, DX8 will support all this stuff anyway, so there probably will be no need to make any workarounds; everything should work out of the box (...heard this so often before :-)).
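
(On the OpenGL side, that kind of hardware-specific functionality is exposed through vendor extensions the game probes for at run time -- e.g. NVIDIA's GL_NV_vertex_array_range. A minimal, purely illustrative check, assuming a GL context is already current:)

#include <GL/gl.h>
#include <cstring>

// Illustrative helper (assumes a current GL context): report whether the
// driver advertises an extension. Compare whole tokens so "GL_NV_foo"
// doesn't falsely match "GL_NV_foobar".
bool HasGLExtension(const char* name)
{
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    if (!ext)
        return false;
    const size_t len = std::strlen(name);
    while (*ext) {
        const char* end = ext;
        while (*end && *end != ' ')
            ++end;
        if (static_cast<size_t>(end - ext) == len &&
            std::memcmp(ext, name, len) == 0)
            return true;
        ext = (*end == ' ') ? end + 1 : end;
    }
    return false;
}

// e.g. only take an NVIDIA-specific fast path if the driver offers it:
//   if (HasGLExtension("GL_NV_vertex_array_range")) { /* fast path */ }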

I believe nVidia just wants to ensure that the damn thing will work fine with the next incarnation of the Unreal engine, not more than that. Also, I think the Epic guys wouldn't be so stupid as to make the same mistake again. My two cents...
#44 by "godZero"
2000-09-04 14:41:18
godZero@gmx.de
Andy, why don't you just ask Warren M. about it?
#45 by "Dethstryk"
2000-09-04 15:19:50
dethstryk@damagegaming.com http://www.damagegaming.com/
<b>xero wrote in post #41:</b>
<quote>D3D is going to eat some market share, but there will still be a large section using OpenGL because it's what's supported under Linux, Unix, BeOS, and Mac.</quote>
"Large" section? Maybe this is one of those Linux things, but I really doubt that the market share is really being eaten up by that many Linux, Unix, BeOS, and Mac game users. Maybe it's just me.


--
Dethstryk
Damage Gaming
#46 by "DevPac2"
2000-09-04 15:44:58
devpac2@hotmail.com
One thing that I've been thinking about, especially given NVIDIA's involvement with the X-Box, is how much control they will have over the content of DX. Given how much is riding on the X-Box, I'd have thought that MS will have given NVIDIA as much knowledge as they need for designing and implementing a DX8 chip. Also, MS may well be asking NVIDIA 'how do you want us to implement function Y?' I think that NVIDIA will have the best DX8 chip out there because of this, and I'm just wondering about the knock-on effects.
(Mindless speculation admittedly :), but it's something I keep thinking about.)

Dev
#47 by "mcgrew"
2000-09-04 17:25:08
mcgrew@famvid.com http://theFragfest.com
In a recent thread, Hulka popped in swearing because Unreal Tourney was hosing his machine. It seems he had an Nvidia card.

I have an Nvidia card and seemingly had the same show-stopping, horrible bug. When the program ran, the screen was garbage. In my case, the game's INI files were pointing to the unused and inactive onboard video chip instead of the Creative TNT card.

Before I read the other comments that have probably addressed this: the odds that this horrible show-stopper only happened to two people are slim. My guess is both Nvidia and Epic have taken a lot of heat and are getting together to clear both their names.

-Steve
#48 by "Crusader"
2000-09-04 20:01:50
crusader@linuxgames.com http://www.linuxgames.com
George Broussard wrote:
<quote>
What other chipsets are worth a crap besides nVidia and 3Dfx?
</quote>
Guess you haven't kept up with the ATI Radeon, George, which compares favorably to the V5 and the GeForce chipset families.

However much you may wish for one OS, one solution, one API, one company, it won't happen. Choice is a <b>good</b> thing.
#49 by "Dethstryk"
2000-09-04 20:06:47
dethstryk@damagegaming.com http://www.damagegaming.com/
I will give Andy some credit on the topic title. I can't read it enough. :)


--
Dethstryk
Damage Gaming
#50 by "Matthias Worch"
2000-09-04 20:19:33
mworch@legendent.com http://www.langsuyar.com
#49" When I first read the topic I seriously thought it was gonna be a thread about Daikatana :)
