PlanetCrap 6.0!
T O P I C
Cg: nVidious Plot or Graphical Revolution?
June 20th 2002, 01:55 CEST by m0nty

Slipping in amongst the news of Morrowind, NWN and WarCraft 3, nVidia and Microsoft have unveiled what they claim is the strongest push yet to bring cinematic-quality real-time graphics from the movie screen to the PC. The introduction of a High-Level Shading Language (HLSL) in version 9 of Microsoft's DirectX, revealed earlier in 2002 and only just gone into beta, has been trumped by nVidia, which has announced a high-level programming language called Cg - short for "C for graphics".

The common, familiar C-like syntax enables rapid development of stunning, real-time shaders and visual effects for graphics platforms, and is compatible with Microsoft's recently announced High Level Shading Language for DirectX® 9.0.

With Microsoft around, you know there will always be standards issues (remember OpenGL?), and this is no different. Cg represents nVidia's latest salvo in its battle with ATI and other graphics card vendors for the hearts and minds of developers over their preferred programmable shaders and 3D APIs. The vendors have already started attacking each others' DX9 support, and Cg brings a whole new set of arguments about cross-platform implementation and proprietary technologies. Let the new standards war begin!

If there is to be a war, nVidia surely have the metaphorical SoF2 briefcase camped beyond all hope of the terrorists escaping with it. In this interview with nVidia svengali David Kirk conducted by new Eurogamer offshoot Gamesindustry.biz, HLSL is even characterised as "Microsoft's own implementation of Cg", suggesting that nVidia is dictating terms to the dreaded Redmondians. Kirk fends off the inevitable question about incompatibility with non-nVidia hardware thusly:

"Our compiler generates shader code and sends it to DirectX or OpenGL, and shaders are a standard, so they should run on any card that supports the shader standards, including our competitors? Besides, I think it's in our interest to make sure that Cg runs well on everything - we want people to really use this technology, and that's all about taking away their reasons not to... Making the compiler so that it didn't work well on ATI cards, for example, would be really bad for us too."

Do you trust this man? Has nVidia created Cg out of the goodness of its heart, or is it really just another attack on its competitors? Does it matter, or should we just accept nVidia's dominance of the graphics card market? Will the majority of programmers choose Cg with all its supporting apps, or play it safe with the more limited HLSL? Will Cg become the new Glide - the troublesome rendering technology that was left behind when 3Dfx died? What does it mean for graphical engine makers, like id, Epic and Lithtech? What does this mean to games already in development: will they have to redo parts of their code? (Hi George B!)
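
For those who haven't seen the language yet, here is a minimal sketch of what a Cg vertex program looks like - an illustrative example only, not one lifted from nVidia's toolkit samples. The POSITION and COLOR semantics bind the parameters to hardware registers, and the uniform matrix is supplied by the application:

// Minimal illustrative Cg vertex program: transform the vertex and pass its colour through.
struct VertexOutput {
    float4 position : POSITION;   // clip-space position handed to the rasteriser
    float4 color    : COLOR;      // vertex colour, interpolated for the pixel stage
};

VertexOutput main(float4 position : POSITION,
                  float4 color    : COLOR,
                  uniform float4x4 modelViewProj)   // supplied by the game each frame
{
    VertexOutput OUT;
    OUT.position = mul(modelViewProj, position);    // standard model-view-projection transform
    OUT.color    = color;
    return OUT;
}

The Cg compiler turns source like that into the low-level shader code that DirectX or OpenGL actually consumes, which is the "shaders are a standard" point Kirk is making above.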
C O M M E N T S
#1 by LPMiller
2002-06-20 01:57:38
lpmiller@gotapex.com http://www.gotapex.com
Words......words......words......summary......Ok. Answer: Yes.

I believe I can fly......urk.
#2 by LPMiller
2002-06-20 01:58:51
lpmiller@gotapex.com http://www.gotapex.com
You also need to realize, CG does not equal Glide - it's not the same thing at all.  Glide was a unique API like OpenGL or Direct X, whereas CG allows you to program FOR OpenGL or Direct X.

I believe I can fly......urk.
#3 by Matthew Gallant
2002-06-20 01:59:18
http://www.truemeaningoflife.com
I don't see this as nVidia trying to control graphics development at all. It looks like a useful tool, not a new standard.

Current market value of the Max Payne IP according to a comparison of the market capitalization of Take Two pre- and post- sale: approx. -$244,000,000.
#4 by EricFate
2002-06-20 01:59:21
nVidia has competitors?
#5 by LPMiller
2002-06-20 02:01:27
lpmiller@gotapex.com http://www.gotapex.com
Really, it's sort of like arguing whether releasing C++ is the new salvo in the Coding Wars or something.

Of course, how well one can use CG to program OpenGL calls for an R300 remains to be seen....but I think you misunderstand what CG is supposed to be.

I believe I can fly......urk.
#6 by LPMiller
2002-06-20 02:02:19
lpmiller@gotapex.com http://www.gotapex.com
Can we go off topic now? I hate graphic discussions.

So, how about that newegg.com? Anyone ever been there?

I believe I can fly......urk.
#7 by HiredGoons
2002-06-20 02:03:32
I lack the technical expertise and do not have access to certain industry secrets that would permit me to form a reasoned opinion on this topic.

Consequently, I plan to take a firm stand on this issue and heap flaming turds on anyone who disagrees with me.
#8 by LPMiller
2002-06-20 02:05:49
lpmiller@gotapex.com http://www.gotapex.com
You're on!!

I believe I can fly......urk.
#9 by Shadarr
2002-06-20 02:07:36
shadarr@yahoo.com http://digital-luddite.com
nVidia has competitors?


Exactly.  You want a top 3d card, you buy a GeForce4.  You want a reasonable 3d card, you buy a GeForce2.  It's very hard to find a non-nVidia card these days, and not worth the effort.
#10 by LPMiller
2002-06-20 02:09:21
lpmiller@gotapex.com http://www.gotapex.com
Apparently, you missed all that shelf space ATI gets. Which, btw, is a fine card in its own right, now that the drivers aren't sucking so much.

And the R300 may very well put ATI ahead of the curve. Has to happen sooner or later, at any rate.

I believe I can fly......urk.
#11 by Max Diablos
2002-06-20 02:11:04
Since a few big-shots and the lazy media started yapping on about "cinematics" and "immersiveness" the general population seem to act more like a bunch of Pavlovian dogs than ever before. But, that is by the by. I see no particular reason to rush into supporting NVidia's Cg when OpenGL 2.0 is just around the corner. Unlike NVidia's land-grab it's a genuinely open standard. On the issue of D3D, it looks like Microsoft has been spoon fed by NVidia since the introduction of shaders in D3D8, and this trend seems set to continue with D3D9. This is what some people might call a reverse takeover.

No helter skelter. No over the rainbow bad trip apocalypse. Just us and this moment now. This is how it ends.
#12 by Shadarr
2002-06-20 02:12:50
shadarr@yahoo.com http://digital-luddite.com
I don't know about shelf space, but when I was picking what I wanted in my new computer there was one cheapo 32meg card, a bunch of GeForce2 64meg cards, and an assortment of GeForce3's and 4's.  That was it.  I assume I could've special ordered an ATI, but I didn't see any point.
#13 by Mister Nutty
2002-06-20 02:17:53
The answer is neither.   Cg's a useful tool for building shaders in a C-like (and by extension, Renderman-like) format instead of asm.  That's all.  Yes, DX9 will have similar technology.  Yes, OpenGL2 will have similar technology.  But neither of those are shipping (and DNF will probably ship before there's a real OpenGL 2 implementation since it's probably a couple years off just from standardization) and Cg is.

I think NVidia should get the benefit of the doubt that they are not trying to pull a closed-standard grab.  They clearly saw what that did to 3dfx over the long term since they rode that train to their current position.  And in the past Nvidia had always been supportive of open rendering APIs as much as possible.

I think too much is made of the fact that Cg is not vendor-neutral under OpenGL.  What should NVidia do, code in support for all vendors' register combiner extensions?  ATI could just as easily fix the problem by implementing nvidia's extension mechanism for their drivers.

Anyone who has dealt with the tools and technologies that NVidia publishes through their developer relations network knows that in virtually all cases their stuff is generally useful for all GPUs, not just their own.  They just want to help push developer/user acceptance of the GPU concept and then stand back and let their boards sell based on the strength of their brand and the quality of their hardware and drivers.

Smashing!
#14 by crash
2002-06-20 02:19:41
from 0, and the links, the questions:

Do you trust this man?

haven't met him, so can't say. judging personality from a press release... yeah. no.

Has nVidia created Cg out of the goodness of its heart, or is it really just another attack on its competitors?

the goodness of its heart is usually an attack on competitors. this isn't an either/or. it's not a direct attack, but it's certainly a case of feature one-upmanship; i.e. another reason to choose nvidia instead of someone else.

Does it matter, or should we just accept nVidia's dominance of the graphics card market?

until and unless there is a real competitor to nvidia's current market-share peak, this question is sort of irrelevant. unless, of course, you're working on a potentially competitive chipset. let us know. if something comes along that's better than nvidia, i'll buy it. nothing has for a while. QED.

Will the majority of programmers choose Cg with all its supporting apps, or play it safe with the more limited HLCL?

i think they'll try both and see what works best. both go with DX9, so it's not like they'll lose all that much trying 'em both out.

Will Cg become the new Glide - the troublesome rendering technology that was left behind when 3Dfx died?

Cg isn't a rendering technology. it's an API that apparently makes working with DX9 easier/faster/more convenient. apples and oranges.

What does it mean for graphical engine makers, like id, Epic and Lithtech? What does this mean to games already in development: will they have to redo parts of their code?

in my opinion, it means "If we can implement it smoothly and quickly we'll use it for this game; if we can't, we'll use it for the next one." that's what usually happens, anyway.

Whoops, sorry, was my common sense showing again? -HoseWater
#15 by Mister Nutty
2002-06-20 02:20:48
ATIs are a lot more popular than some of you suggest, and their high-end hardware is top notch, especially in bang-for-buck ratio.  Their drivers do still need a lot of work compared to Nvidia's but they are steadily improving.

And Max, OpenGL 2 is so far from right around the corner it's not funny.  The ARB does not move quickly.  Say all you want about open vs closed standards, but there are certain benefits to totalitarianism, which is why DX9 will get at least a year (and very likely more) of a jumpstart on OGL2.

Smashing!
#16 by Mister Nutty
2002-06-20 02:23:33
crash:

Cg is cross-API.  It works with OpenGL 1.1/1.2 and DX8 right now.  It doesn't need DX9, which incidentally has its own high-level shader language that is very similar but not exactly the same (the two will be compatible at the intermediary code level).

Yes, I'm picking nits.

Smashing!
#17 by LPMiller
2002-06-20 02:28:06
lpmiller@gotapex.com http://www.gotapex.com
Since a few big-shots and the lazy media started yapping on about "cinematics" and "immersiveness" the general population seem to act more like a bunch of Pavlovian dogs than ever before. But, that is by the by. I see no particular reason to rush into supporting NVidia's Cg when OpenGL 2.0 is just around the corner. Unlike NVidia's land-grab it's a genuinely open standard. On the issue of D3D, it looks like Microsoft has been spoon fed by NVidia since the introduction of shaders in D3D8, and this trend seems set to continue with D3D9. This is what some people might call a reverse takeover.


Oh geeze. There is nothing to support, any more than Adobe needs to 'support' Visual Studio.Net


Plus, it's art.

I believe I can fly......urk.
#18 by Max Diablos
2002-06-20 02:29:05
will probably ship before there's a real OpenGL 2 implementation since it's probably a couple years off just from standardization


Pardon? OpenGL 2.0 is due spring 2003.

OpenGL Shading Language Compiler

This zip file contains the source code and project files necessary to build the OpenGL Shading Language compiler as a standalone executable for Windows platforms. It is 3Dlabs intention to help OpenGL move forward by providing source code to the OpenGL Shading Language parser and intermediate-code generator that we have implemented. We are making this code available in the hopes of encouraging rapid development of OpenGL 2.0 implementations as well as common tools and utilities.

Once the OpenGL Shading Language compiler is built, it is capable of parsing and generating intermediate code for vertex shaders and fragment shaders written in the high-level language defined by the OpenGL 2.0 Shading Language white paper. It is intended that other vendors would develop hardware-specific back ends that turn the intermediate code into device-specific machine code.


The difference between the OpenGL 2.0 solution and NVidia's is that OpenGL 2.0 isn't closed in the sense that NVidia's is. If NVidia spun off Cg as a separate entity and allowed other IHVs to participate as full board members as with the OpenGL ARB, I might be persuaded that this isn't just a greedy landgrab.

No helter skelter. No over the rainbow bad trip apocalypse. Just us and this moment now. This is how it ends.
#19 by crash
2002-06-20 02:32:28

Yes, I'm picking nits.

no worries. learned somethin i didn't know. (btw, this is why i use the word "apparently" when i'm not sure i understand something. and a good thing, too, come to think of it.)

Whoops, sorry, was my common sense showing again? -HoseWater
#20 by Max Diablos
2002-06-20 02:42:05
Oh geeze. There is nothing to support, any more than Adobe needs to 'support' Visual Studio.Net


Support in the sense "use" not "fanboy."

ATIs are a lot more popular than some of you suggest, and their high-end hardware is top notch, especially in bang-for-buck ratio.  Their drivers do still need a lot of work compared to Nvidia's but they are steadily improving.


Their driver team is a lot better than it used to be, and they've certainly been stung into action. In some respects I think what happened to 3Dfx and the rise of NVidia forced their hand. I remember reporting an annoying OpenGL bug to the engineering team a few years ago and the response was very positive. In fact they pulled the release of the driver until they had cleaned up a few of the issues they probably would've let slip through the previous year.

And Max, OpenGL 2 is so far from right around the corner it's not funny.  The ARB does not move quickly.  Say all you want about open vs closed standards, but there are certain benefits to totalitarianism, which is why DX9 will get at least a year (and very likely more) of a jumpstart on OGL2.


See George Broussard for a definition of "soon." OpenGL 2.0 is due within the next year. As for the speed of ARB movement I think you'll find they're moving a lot faster than they used to. But speed shouldn't come at the price of creating an unusable and useless standard. I'd rather they got it right the first time.

No helter skelter. No over the rainbow bad trip apocalypse. Just us and this moment now. This is how it ends.
#21 by LPMiller
2002-06-20 03:01:45
lpmiller@gotapex.com http://www.gotapex.com
Yeah, I understood what support meant, loverboy. You're still about as far off base as you can get.

Let me state it in plain language. CG is NOT AN API in and of itself.

It is NOT Glide.

It is NOT like Open GL.

It is NOT like Direct X.

It allows you to DO things in Open GL and Direct X. Hardly a land grab. Devs can either use it or not; NOT using CG won't limit what they can do. It should give them another tool that might make certain functions easier - that's up to the devs to say.

I believe I can fly......urk.
#22 by Dinglehoffen
2002-06-20 03:05:43
Fanny Fungus
SoF 2 STINKS.

"Cause you'll be LIVIN' IN A VAN DOWN BY THE RIVER EATIN' GOVERNMENT CHEESE!"
#23 by AnalFissure
2002-06-20 03:13:23
It reeks of fun.
#24 by Max Diablos
2002-06-20 03:16:27
LPMiller,

Being rude or saying the complete opposite of what you mean doesn't help your case.

Cg is an abstraction that floats above D3D and OGL. In that respect it's as much an API as anything else. Don't bother arguing it. The point is technically correct. Landgrab applies in a wider sense than the narrow one you're using. NVidia are taking the initiative and grabbing mindshare of coders who might want to produce shader solutions to graphic problems. They are jumping ahead of similar initiatives that might be made by other IHVs. It might even be used as a trojan horse for NVidia-specific OpenGL extensions. Some people might even question the need for D3D or OGL shaders after Cg gets a full head of steam. All you can be certain of is that nothing is black and white. As for what others decide that's up to them.

No helter skelter. No over the rainbow bad trip apocalypse. Just us and this moment now. This is how it ends.
#25 by chris
2002-06-20 03:17:37
cwb@shaithis.com http://www.cerebraldebris.com
Correct me if I'm wrong, but couldn't Carmack just write his own shader system if he didn't want to use Cg? It's not like you HAVE to use Cg to interface with any of the available, or upcoming, graphics APIs.

-chris
#26 by Darkseid-D
2002-06-20 03:18:14
rogerboal@hotmail.com
amusing, since ATI are apparently shaping the direction of DX9 (beta testing in late august I believe).

horrifying since ATI can't code drivers any more than penguins can write slasher flicks whilst sitting in a bar in downtown Ulan Bator.

oh, they cheat at benchmarks, sure ATI you get close to Geforce 3 performance with your `best` card in q3, hang on that's 16bit with low quality everything, but the settings are set for high... oh you have code that detects quake3 and optimises for speed whilst making everything look like ass, even if the user wants the eye candy.

joy.


Ds

Never argue with an idiot, theyll drag you down onto their level, then beat you with experience.
#27 by MCorleone
2002-06-20 03:26:41
I won't leave nvidia for anything save a severe Titanic'ing of the 3dfx sort.  In my youth I touted Matrox.  I waited with bated breath for the G200.  It arrived.  It was okay, the drivers were absolutely horrible.  No OpenGL ICD.  A wrapper came along about 20 revisions down the pike that jury-rigged Quake2, and didn't work with anything else.  It was never fixed.  I bought a TNT.  I have never looked back and never will.  Driver support from nvidia is outstanding, and they even milk another 5% on average out of an aging card when they drop the next best thing into the market.  A super-powered card with shitty drivers is a 200 bedroom mansion with no doors.  Thanks, no.

Build a man a fire and you'll keep him warm for the rest of the night.  Light a man on fire and you'll keep him warm for the rest of his life.
#28 by Mister Nutty
2002-06-20 03:39:21
Chris,

  That is correct.  You don't have to use Cg.  You can still use DX8 style assembly shaders, or NV's register combiners, or ATI's register combiners (for OpenGL work).  When DX9 comes out you can use any of those, or DX9's high level shading system.  When OpenGL 2 comes out (spring 2003? I'll believe it when I see it...they might as well just say "when it's done" like 3DR does) you can use OpenGL's shading language.  It's up to you as a developer!!

Choice isn't a bad thing, especially when one of those choices (Cg) allows you to start getting things done right now.  


Also:

It's not like NVidia is actively breaking anything here.  WRITING SHADERS FOR OPENGL (1.x) is ALREADY VENDOR SPECIFIC!    Even if you don't use Cg you have to write two separate implementations of your shader system because NVidia and ATI haven't agreed on a common extension, and likely never will since everyone will just wait for OGL2 for that.  So it's not like Nvidia is taking anything away from you.  All they've done is made it EASIER to do the shader work on their side.  In no way does it make it HARDER or DIFFERENT at ALL to write the shaders on the ATI side.   Based on that fact alone, any "land grab" theory is just insane.  If Nvidia has this technology available NOW, why should they sit back and wait for the ARB to get its shit together on OpenGL2?

And if you're writing to DirectX8, Cg already works under both NVidia's and ATI's cards (and anyone else's who implements the DirectX shader system(s))!

In no way did NVIDIA go out and make this stuff not work on ATI's hardware on purpose.  It's vitally important to grasp the fact that for them to make it work on ATI/OpenGL would have required them to specifically code a backend layer for Cg on top of ATI's proprietary shader extensions for OGL.  Why in the fuck would they do this?  As a developer, I'd be happy if everything were nice and standard, but as a shareholder of NVIDIA, I'd be really upset that they were spending money for the sole purpose of supporting a competitor's technology.

And lastly, it's important to note that all of the upcoming high-level shader systems are so similar (because they all crib from Renderman) that transitioning from one to another would be fairly trivial.  So use Cg for now and then move to DX9's HLSL or OGL2.0 later. It's really not that hard based on what's been released for Cg and what's in the DX9 beta and OGL specification.

Smashing!
#29 by LPMiller
2002-06-20 03:43:08
lpmiller@gotapex.com http://www.gotapex.com
Being rude or saying the complete opposite of what you mean doesn't help your case.


Ok, I didn't use the term fanboy, Mr Pot, nor am I trying to make a case. I'm trying to edify your obviously country hick education with a little coming to Jesus party.  As for rude, couching your language in flowery words I'm not positive you've proven you understand the actual meanings of does not in fact make you polite.

Or, if you prefer, Fuck you.

Cg is an abstraction that floats above D3D and OGL. In that respect it's as much an API as anything else.


No, it is not an API. If anything, it's GUI to DOS. It's an extra layer that the Carmacks of the world don't need, but will help out those with lesser video kung fu. An API would have to be supported to function; CG can be ignored completely.

Don't bother arguing it. The point is technically correct.


No, not if you use the term 'technically' to mean 'based on or marked by a strict interpretation', as opposed to 'out your ass'. If you meant the latter, then you are correct, I shan't be able to argue with Assology.


Landgrab applies in a wider sense than the narrow one you're using. NVidia are taking the initiative and grabbing mindshare of coders who might want to produce shader solutions to graphic problems. They are jumping ahead of similar initiatives that might be made by other IHVs. It might even be used as a trojan horse for NVidia-specific OpenGL extensions. Some people might even question the need for D3D or OGL shaders after Cg gets a full head of steam. All you can be certain of is that nothing is black and white. As for what others decide that's up to them


That is the biggest crock of shit EVAR. Coders who can produce shader solutions will, the rest will CG. Or cut and paste Carmack's code, whatever works. Wait, let me go back a second to...

Landgrab applies in a wider sense than the narrow one you're using


Ah, the heart of the matter. Yet another example, Max, of a word that everyone else seems to understand, but for you, apparently, has its own meaning. I'd ask to see the dictionary you use, but I'd be afraid of flowing pages that dripped to the floor and constant perspective shifts.

You can get away with the abstract bullshit when you are talking games, because you're just another idea man in a forum full of 'em.  It doesn't work so well when we actually enter the concept of reality.

Money talks, bullshit walks. But apparently, Bullshit also talks on and on for a really long time before it gets to the walking part. Are we at the walking part yet?

I believe I can fly......urk.
#30 by Mister Nutty
2002-06-20 03:49:53
Ds,

DX9 is already in (developer-level) beta test.  It should be in release by late Sept or so based on Microsoft's normal schedule.  And I would hardly say ATI is "shaping" it.  All the major vendors are inputting their 2 cents. Having said that, ATI has made a number of positive contributions to DX8 & 9, arguably more so than NVIDIA who really didn't have a great year in terms of innovation (possibly due to all the XBOX overhead).  But NVIDIA's got some cool shit up their sleeves.


And Max, are you insane?  Pixel & Vertex shaders are increasingly important, but they are but one of many features of a full graphics API.  Cg is in no way an attempt to create a new API.  NVIDIA is heavy into OpenGL (a lot of their engineers are ex-SGI guys who have been working on OpenGL since the beginning) and heavy into DX because of their all-important Microsoft partnership.

Smashing!
#31 by Max Diablos
2002-06-20 03:55:17
Mister Nutty,

Based on that fact alone, any "land grab" theory is just insane.
...

It's vitally important to grasp the fact that for them to make it work on ATI/OpenGL would have required them to specifically code a backend layer for Cg on top of ATI's proprietary shader extensions for OGL.  Why in the fuck would they do this?


Hello?

And lastly, it's important to note that all of the upcoming high-level shader systems are so similar (because they all crib from Renderman) that transitioning from one to another would be fairly trivial.  So use Cg for now and then move to DX9's HLSL or OGL2.0 later. It's really not that hard based on what's been released for Cg and what's in the DX9 beta and OGL specification.


Inertia.

...

Still, fair points well argued. Ultimately I think everyone has to make their own minds up on the functionality, usability, and portability issues, along with any pragmatics you want to throw into the pot. Long term I see more advantages in going for a pure OpenGL 2.0 solution. But, that's my opinion.

No helter skelter. No over the rainbow bad trip apocalypse. Just us and this moment now. This is how it ends.
#32 by Bailey
2002-06-20 03:57:20
LPMiller

Apparently, you missed all that shelf space ATI gets. Which, btw, is a fine card in its own right, now that the drivers aren't sucking so much.

That's sort of along the lines of "He's such a nice fellow now that he's stopped stabbing babies in the maternity ward while their families watch".

By the way Max, before you start trying to throw down on LP regarding the whole card/driver shebang, you mighta wanna research what he does for a living... i.e. review cards, drivers, etc. Just a thought.

I'm making a cyberdifference in an eCommunity populated with iFolk- *HURK* suicide
#33 by crash
2002-06-20 04:03:14
dammit, Bailey, you ruined it. was just gettin warmed up, too. LP's on a roll.

Whoops, sorry, was my common sense showing again? -HoseWater
#34 by bago
2002-06-20 04:07:10
manga_Rando@hotmail.com
After some analysis it seems that Cg is roughly the equivalent of the .NET Framework IL and CLR concepts for GPU's, providing a layer of abstraction betwixt the GPU and the Shader level of work, allowing for one shader routine to be translated into optimized versions per GPU, and also promoting code re-use and readability. As it stands now it seems that GPU's are still too simple to fully take advantage of this abstraction layer, but by the next generation this will prove to be rather valuable. Thusly, the goal of promoting it now is to get the idea into people's heads so when the hardware comes out that can fully utilize this technology, the concept, training, and early adopter phases will have come to pass, allowing for rapid utilization on v.next.

iamelectro
#35 by Mister Nutty
2002-06-20 04:07:23
Max,  

  To agree upon some common ground, I think that a pure OpenGL 2.0 solution would be much better (at least on the OpenGL side.  All of the work I currently do is in DirectX so the issue is kind of moot to me for now).  However, if you are writing a game right now you can't really wait around for OpenGL 2.0 to be sorted out.  Even if they come in on time and the spec is finalized in 2003, there's still going to be a period of poor driver support that you'll want to avoid.  Hell, it took years just to get to the point where you could reasonably rely on OpenGL 1.2 level standard extensions being supported on most major cards.

My only basic points are thus:

Cg works right now.  You can build it into a game engine right this minute.

Your only other option under OpenGL that is available right now is writing low-level shaders on top of NV's and ATI's register combiner APIs.  This is already vendor specific so it gains you nothing.  So you might as well make use of Cg for the NVIDIA side of things unless you really like writing shaders in assembly, which some people in fact do.

Cg works in a vendor-neutral manner under DX8 (right now) because Microsoft forces standardization of the DX API.  Of course you can make a case that it's biased towards NVIDIA since there are some features available on ATI's GPU that Cg won't make use of.  But if you're going to use those ATI-only features you're going to be writing multiple versions of your shaders anyway.

In summary: Real-time shading languages are a new thing, it's going to take time for the smoke to settle.  All we can do as developers right now is choose the solution best suited to us.  For many that will be Cg, at least until DX9 and OGL2 appear for general use.

I'm not an NVIDIA fanboi (though I do use their products), but I think it's safe to give them the benefit of the doubt.  They are smart guys.  They don't want to compete with Microsoft (and thus won't try to make this a full API) for various reasons.  And they know full well what happened to 3dfx (particularly considering they have quite a few former 3dfx guys on payroll), they are too smart to attempt a repeat of that.

Smashing!
#36 by Mister Nutty
2002-06-20 04:09:09
bago,

   You are correct, sir.

Smashing!
#37 by Max Diablos
2002-06-20 04:14:43
#32 by Bailey

By the way Max, before you start trying to throw down on LP regarding the whole card/driver shebang, you mighta wanna research what he does for a living... i.e. review cards, drivers, etc. Just a thought.


Couldn't give a fuck. Jobs, titles, and honours mean nothing to me. He's said nothing of value so far, is talking out of his a$$, and can't even manage to be half polite. All I've done this week is push around a different perspective on the game industry and offer first impressions of Cg, and all I get back is a display of frustration from an ego-driven potty mouth. He doesn't know the first thing about me or what I do so I'll let all the other comments slide. I'm here to be informed, educated, and entertained. Rude a$$holes I can live without.

No helter skelter. No over the rainbow bad trip apocalypse. Just us and this moment now. This is how it ends.
#38 by bago
2002-06-20 04:19:05
manga_Rando@hotmail.com
Max: you're wrong.. get educated. hehe

a language is separate from an API.  Semantics mean things when they turn into numbers.

iamelectro
#39 by Mister Nutty
2002-06-20 04:19:13
I must add, however (sorry for the multiple posting), that while Cg is quite limited compared to what shaders will be in a year or two, it does actually offer practical value right now (it's not just a learning exercise).  Writing shaders in assembly by itself is not a particularly hard thing.  If you can grasp the power of shaders to begin with you can certainly learn to express them in assembly, especially since the instruction set is fairly small.

At the very least, one thing Cg does is give you a level of simple compile-time preprocessing that assembly-level shaders don't have.  Since shaders don't have complex control logic in them (since most GPUs don't support that this generation), you often wind up with multiple shaders that are virtually identical except for very minimal alterations.   Cg minimizes this quite a bit since using its preprocessing you can generate the various shaders from a single source file based on some compile-time defines.  This makes maintenance easier.  This is, in fact, the primary reason I am using Cg.
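
To illustrate the idea (a toy sketch, not my actual production code, and assuming the define gets fed to the compiler the usual C way - a -D style option or a #define pasted onto the front of the source): one fragment program source, two compiled variants, with and without a detail texture.

// One source file, two shader variants selected at compile time.
// DETAIL_MAP is a hypothetical define; call it whatever you like.
struct FragIn {
    float4 color : COLOR0;
    float2 uv    : TEXCOORD0;
#ifdef DETAIL_MAP
    float2 detailUV : TEXCOORD1;
#endif
};

float4 main(FragIn IN,
            uniform sampler2D baseMap
#ifdef DETAIL_MAP
          , uniform sampler2D detailMap
#endif
           ) : COLOR
{
    float4 result = IN.color * tex2D(baseMap, IN.uv);   // base texture modulated by vertex colour
#ifdef DETAIL_MAP
    result *= tex2D(detailMap, IN.detailUV) * 2.0;      // optional modulate-2x detail pass
#endif
    return result;
}

Without the preprocessor you'd be maintaining two nearly identical assembly shaders by hand.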

Smashing!
#40 by Mister Nutty
2002-06-20 04:25:17
bago,  

  I have to disagree a bit.  Normal rules of semantics don't hold up in the software development industry.  One could in fact argue that Cg is an API.  It's an API to the GPU's shader system.  It is not, however, nor do I ever think NVIDIA intends it to be, an API at the level of OpenGL or D3D.

 Think of it thusly: BSD-style sockets are an API - an API for communicating over TCP/IP to other computers on a network.  However, they don't subsume the entirety of the API to the OS.  The rest of the API to the OS could be Win32 or it could be UNIX/POSIX style.  The socket API exists to allow some amount of code reuse to this subsystem; it does not exist to become a full OS API on its own because that is nonsensical, and likewise I don't see Cg or any other shader language becoming the entirety of the API to the graphics hardware.

Smashing!
#41 by Max Diablos
2002-06-20 04:26:43
#35 by Mister Nutty

In summary: Real-time shading languages are a new thing, it's going to take time for the smoke to settle.  All we can do as developers right now is choose the solution best suited to us.  For many that will be Cg, at least until DX9 and OGL2 appear for general use.


In a nutshell that's probably all that can be said. Whatever the "best" solution turns out to be you have to work within time pragmatics. As you say, pick the best solution under the circumstances you're operating in, and take another look when the dust settles.

No helter skelter. No over the rainbow bad trip apocalypse. Just us and this moment now. This is how it ends.
#42 by Mister Nutty
2002-06-20 04:32:22
I'd like to make one other post in my bid to monopolize this thread:

Say ATI implemented their own Cg compatible compiler and NVIDIA threatened to sue over IP issues, OR OpenGL 2.0 comes out and NVIDIA decides not to support the shading language in their OpenGL 2.0 drivers and instead tells developers to "ride the Cg bus".

I'd be the first to slag those motherfuckers.

But so far NVIDIA (since I'm not some crazy Linux Open Source zealot who thinks closed source drivers are evil) hasn't done much to disappoint me.  So I give them the benefit of the doubt.

Smashing!
#43 by Max Diablos
2002-06-20 04:43:32
But so far NVIDIA (since I'm not some crazy Linux Open Source zealot who thinks closed source drivers are evil) hasn't done much to disappoint me.  So I give them the benefit of the doubt.


The "ah, but." Don't get me started on what I think about Linux and those grade A nutters. Really, the only thing that gets me animated is open standards. The only real problem I have with Cg is it being a closed standard. Allowing other IHV's to code backends isn't the same thing. I don't oppose Cg in principle or practice, it's just that I'm giving it a small "c" conservative welcome.

No helter skelter. No over the rainbow bad trip apocalypse. Just us and this moment now. This is how it ends.
#44 by Mister Nutty
2002-06-20 04:50:38
I like open standards too, but I'll tolerate closed standards that solve a real problem, so long as the closed standard doesn't actively try to torpedo a working open standard, unless that closed standard is DirectX which Microsoft did in fact use to try to torpedo the OpenGL open standard, because Microsoft has huge market share and even more than useful closed standards or open standards I like money and someone pays me money to write DirectX code.

Smashing!
#45 by Max Diablos
2002-06-20 04:53:54
#30 by Mister Nutty

And Max, are you insane?  Pixel & Vertex shaders are increasingly important, but they are but one of many features of a full graphics API.  Cg is in no way an attempt to create a new API.  NVIDIA is heavy into OpenGL (a lot of their engineers are ex-SGI guys who have been working on OpenGL since the beginning) and heavy into DX because of their all-important Microsoft partnership.


Missed this earlier. Yes, I know all that. I wasn't saying that NVidia were trying to landgrab a full 3D API feature set. I was saying they're trying to grab mindshare of the shader feature set in the minds of gamers and developers. Long term I don't think it matters much. But, never is a long time.

No helter skelter. No over the rainbow bad trip apocalypse. Just us and this moment now. This is how it ends.
#46 by bago
2002-06-20 04:54:52
manga_Rando@hotmail.com
Well, technically speaking, an API is an Interface, a defined binary gateway to functionality. Technically the API portion would be the functionality exposed. The language is just a bunch of semantic rules for translation, although in this case, I can see how the lines are blurred, in that there is no binary gateway but rather translated functionality. But, by the same token, there is no Interface, thus no API.

Of course at a larger scale, it's still a bunch of words you gotta learn and type to make the box work.

iamelectro
#47 by Max Diablos
2002-06-20 04:59:13
I like open standards too, but I'll tolerate closed standards that solve a real problem, so long as the closed standard doesn't actively try to torpedo a working open standard, unless that closed standard is DirectX which Microsoft did in fact use to try to torpedo the OpenGL open standard, because Microsoft has huge market share and even more than useful closed standards or open standards I like money and someone pays me money to write DirectX code.


Democracy vs. dictatorship. Each has their advantages and disadvantages.

No helter skelter. No over the rainbow bad trip apocalypse. Just us and this moment now. This is how it ends.
#48 by jjohnsen
2002-06-20 04:59:52
http://www.johnsenclan.com
#22 by Dinglehoffen

 SoF 2 STINKS


I've played for about an hour.  I'm bored.


And I enjoy my ATI card very much thank you.  Never had a problem running anything until GTA3 (and I'm pretty sure that's a result of my PIII 500).
#49 by Dinglehoffen
2002-06-20 05:48:16
Fanny Fungus
I remember when my ATI neospherical probe nicked a abstraction on the VGI double-hemispherical loop wocket. *chuckles* Everyone in the office laughed when the binary reticule splooged all over the OpenGL VC's, allowing the POSIX to jacknife and slit the secretary's skirt right up to the nape of her neck. Well, we later bolted the Cg to the CGIU, causing a swarm of ex-SGI's to execute full body cavity searches on everyone involved with the abstraction nipple.

Oh man, those were the days, when those little things just made your day...

"Cause you'll be LIVIN' IN A VAN DOWN BY THE RIVER EATIN' GOVERNMENT CHEESE!"
#50 by Dinglehoffen
2002-06-20 05:51:48
Fanny Fungus
"I've played for about an hour.  I'm bored."

You've got many, many hours left of the most monotonous, tedious, redundant busy work on 323,555,67.8 warehouses that look exactly the same. All the important work - you know, the missions, the executions, the revenge motives - they ALL will be undershadowed by the busy work....looking for a tiny vent in a warehouse the size of China, or finding obscure crawlspaces to scuttle a Titanic-sized boat. Oh, you'll just have so much fun.

I'd take the game back now before it's too late.

"Cause you'll be LIVIN' IN A VAN DOWN BY THE RIVER EATIN' GOVERNMENT CHEESE!"