
Fog and ATI


alexoscar


It strictly comes down to ATI's willingness to support fog tables (or their 'emulation') under DirectX. ATI has decided that their DirectX drivers will not do this. With the Radeons they support 'vertex fog', which is a completely different way of generating fog (it's generally more compute-intensive for the videocard than fog tables). This has been ATI's position for quite a while regarding fog tables for DirectX on the PC. Strangely, fog tables are supported for the Mac RAVE 3D API, which indicates that it isn't a hardware incompatibility that prevents ATI from doing the same for DirectX. Vertex fog works OK for 3D shooters and limited horizons, but it causes quite a performance hit on maps with much longer/larger horizons.

There will be no fog on ATI cards (on the PC under DirectX) within CM until the release of CMx2. No tweaker, 3rd party 'utility' or driver mod is really going to change this fact since they would have to modify the drivers themselves to actually add the support (additional code that most likely is not present in the drivers). I highly doubt any of the modified ATI drivers have been recoded to that level.

If guesses are correct about CMx2, it will be programmed in the OpenGL API and possibly use one or more methods to generate fog-effects that are compatible with ATI video chips along with other popular high performance video chips.

[ September 13, 2004, 08:53 AM: Message edited by: Schrullenhaft ]


MX4000? DX what? Where are the commas? You really need to start writing a little more precisely, man.

A GeForce FX 5200 or 5200 LE is very slow; for less money you can get a Ti 4400 or 4600 off eBay, which is faster for most games. Some of the higher FX 5xx0s are OK.

DirectX 9 is required for some games and is supported by an FX 5200 but not by a Ti 4x00. But the 5200 will be too slow to run these games anyway, so you don't gain much.

What CPU do you have?


ok how about this:

Chaintech nVIDIA GeForce MX4000 Video Card, 128MB DDR, TV-Out, 8X AGP, Model "SHMX4000" -RETAIL

- Specifications -

Chipset/Core Speed: nVIDIA GeForce MX4000/275MHz

Memory/Effective Speed: 128MB DDR/400MHz

BUS: AGP 8x/4x

Ports: VGA Out(15 Pin D-Sub) + TV-Out (S-Video)

Support 3D API: DirectX 7, OpenGL 1.3

Max Resolution@32bit Color: 2048X1536

Cable/Accessories: Manual, Driver CD

Retail Box (See pics for details)

Model#: SHMX4000

Item#: N82E16814145090

FedEx Saver Shipping $0.99

or this:

AOpen nVIDIA GeForce FX 5500 Video Card, 128MB DDR, DVI/TV-Out, 8X AGP, Model "FX5500-DV128" -RETAIL

- Specifications -

Chipset: nVIDIA GeForce FX5500

Memory: 128MB DDR

BUS: AGP 4x/8x

Ports: VGA Out(15 Pin D-Sub)+TV-Out(S-Video)+DVI connector

Support 3D API: DirectX 9.0, OpenGL

Cable/Accessories: S-Video Cable, Driver CD

Max Resolution@32bit Color: 2048x1536@60Hz

Retail Box (See pics for details)

Model#: FX5500-DV128

Item#: N82E16814135141

Rating: Vote(s): 4 Review(s): 2

FedEx Saver Shipping $0.99


Originally posted by Redwolf:

Interestingly enough, ATI seems to support fog tables in the OpenGL API, just not in DirectX.

Since Mac only uses OpenGL, that is why it works on Macs, but OpenGL on Windows seems to have it, too.

Wrong - in CM, Macs use RAVE 3D, not OpenGL. If it did use OpenGL, you would not see ten posts a month asking why CM doesn't work in OS X.


Originally posted by junk2drive:

I have an AMD Duron 1.2 on an ECS K7SEM motherboard.

dx9 = DirectX9 , sorry if I confused you.

Thank you for the quick reply.

I know dx9 = DirectX 9, but if you re-read your message you'll see some confusing grammar (worse than mine).

You really need to watch out for the AGP voltage. Your mainboard may not support newer cards at all.

And you will not be able to feed anything faster than a Ti 4400 or so; a faster graphics card will sit idle because the processor can't keep up.

I am not familiar with the MX4000, but so far everything MX has been very disappointing. Since you have slow main memory, you really don't want a video card with half the memory bandwidth as well, especially not a 128 MB card.


Originally posted by Panzerman:

I think the reason behind Macs having it and Direct3D not is that, for a while, ATI cards shipped with Macs, and so Apple put some requirements on the cards.

Like demanding that the API only counts as implemented when it is completely implemented. Shocking. I wonder when Microsoft will start demanding that. But then, they themselves would get sued most often if that were a requirement.

For OpenGL, however, it is possible that they wouldn't be allowed to claim OpenGL compliance with ignored or missing features, and the OpenGL consortium would smack them down (MS doesn't seem to care in the DirectX case).

And just in case you don't know: hardware-accelerated DirectX 2D (DirectDraw) is broken on ATI cards, too. Ask any ATF player.

CM has really taken the wrong path with its 3D APIs :(


Exactly. The sad thing is that John Carmack of Doom/Quake fame, the leading developer of 3D game engines at the time, has been preaching the OpenGL gospel all the way back to 1996. Long before CMBO hit the shelves (of the mail-order warehouse), OpenGL was a fully reliable and widely available API on consumer graphics cards. I certainly didn't run into much trouble shooting my way through Half-Life in OpenGL.

You shouldn't forget that CMBO/CMBB/CMAK don't use newer DirectX variants; they are basically DirectX 5 games, and vendors will start dropping DX5 features all over the place.

While Microsoft doesn't kick out APIs or API elements like Apple did with Rave, Microsoft obviously expects vendors to implement all DirectX versions back to the stone age but doesn't do anything to enforce it.

All this in turn leads John Carmack to still advocate OpenGL, because with it, what liberties vendors are allowed to take with the API is not at the mercy of a single company.


junk2drive - I'd recommend the GeForce FX 5500 of the two picks you were looking at. The GeForce 4 MX4000 is just the latest name for the GeForce 4 MX 440 with AGP 8X support (I believe the clockspeeds are still generally the same too). With the FX 5500 you'll have a faster card and one that will possibly support the features of CMx2 (DirectX 9 / OpenGL 1.4-1.5). Your ECS K7SEM motherboard should work fine since it supports AGP 4X, though you may want to check the revision of the motherboard, which is hopefully printed somewhere on its surface (I believe that revision 3.0c has the CPU fixed to the motherboard and is non-upgradeable regarding the CPU). You may also want to double-check your power supply's amperage specification for the 3.3V line, since this will be taxed by a higher-end videocard.

Doing a Google for the K7SEM resulted in this page about AGP videocard support on the K7SEM. From what is written here, I'd guess that the FX 5500 should work if you have a good power supply (at least 18+ Amps on the 3.3V line, preferably much more).

As Redwolf pointed out though, you may not get the maximum speed out of the card with a slower CPU. However the FX 5500 should perform better than the 4 MX4000 with your Duron 1.2GHz. If you're willing to do the eBay route, you can find the GeForce 4 Ti 4400 or 4600 cards for US$75 - 100 (shipping and handling extra). The performance may be slightly better with the 4400/4600, but they won't support DirectX 9 or all of the OpenGL 1.4-1.5 calls.

Redwolf, et al - The problem was that development was done on the Mac, and I'm not sure how good the OpenGL support was on the Mac back in 1996 when work began on CMBO. The other issue for the PC was that a number of cards didn't really have OpenGL support, just DirectX/Direct3D. And if those cards did support OpenGL in some manner, it was an 'afterthought' and the emphasis was on DirectX support. Admittedly these weren't the greatest videocards, but there were a number of them. So while OpenGL is obviously a much more universal format now, driver support back then wasn't nearly as universal. When CMBO was created it wasn't necessary to have the latest and greatest hardware (which is what really supported OpenGL). Of course it is a bit different now, with the demand for higher-fidelity graphics from many of CM's customers.


Schrullenhaft

Thanks for the advice and the link. I have an early version, but will have to check the number on the board. I didn't have any problem with generic PC133 RAM or the Radeon.

The FX5500 above is only $68 at Newegg. If it will work with my mb, give me some improvement, give me fog, and work for CMx2 in my next build, that is what I am looking for. If it won't work I will just wait until I build. The Radeon works fine, just no fog.


Originally posted by Schrullenhaft:

Redwolf, et al - The problem was that development was done on the Mac, and I'm not sure how good the OpenGL support was on the Mac back in 1996 when work began on CMBO. The other issue for the PC was that a number of cards didn't really have OpenGL support, just DirectX/Direct3D. And if those cards did support OpenGL in some manner, it was an 'afterthought' and the emphasis was on DirectX support. Admittedly these weren't the greatest videocards, but there were a number of them. So while OpenGL is obviously a much more universal format now, driver support back then wasn't nearly as universal. When CMBO was created it wasn't necessary to have the latest and greatest hardware (which is what really supported OpenGL). Of course it is a bit different now, with the demand for higher-fidelity graphics from many of CM's customers.

I don't think that holds too much water. The 1996 release of Quake 1 was OpenGL-only (apart from Glide) and certainly ran everywhere.

And even if that was not the case, or if the danger was realized too late, a drastic change in 1998 or 1999 would have been a better solution than bringing a new game to market using RAVE and DirectX 5 - in late 2003.

There were also excellent (within their limits) software renderers for OpenGL, supporting more than 640x480. Those would come in real handy now that many people have overpowered computers and would accept software mode just to answer a quick PBEM turn.


Admittedly this discussion will go nowhere...

It's easy to point out in hindsight where things should have changed. I don't know how familiar Charles was with OpenGL at the time work on CMBO started. By 1998/99, almost 2-3 years had been put into the development of CM, which is the work of just one programmer, not a team or a development based on a licensed engine. It's hard to throw away that much work on an API that does work in favor of something that may become more of an industry standard in the future.

At the time, OpenGL was primarily considered the domain of 3D rendering programs, animation packages and other professional imaging tools. GLQuake was one of the earliest games to make OpenGL popular at the consumer level, but Quake also had a DirectX version (WinQuake) and of course the Glide API-based version, so gamers weren't limited to just one API if their card didn't fully support it.

However DirectX was still a more popular API for game developers. OpenGL, thanks to id Software, had become the primary API for 3D shooters (which have gotten the most attention in mainstream games). But driver support hasn't always been perfect for this API either - lacking performance compared to DirectX drivers from some vendors or lacking 'extension' support, etc. for some implementations (MCD's or ICD's). Software-rendering of OpenGL can be horribly slow too (as it is with any software-based rendering of a 3D environment); slow enough to not consider it if too many people need to use it. It can make your product look pathetic performance-wise or increase the hardware requirements to run it to a point that it is too expensive for your target audience.

So there was a lot of imperfect info and guess-timates about what choices to make (as there almost always is with game development). DirectX and 3D RAVE seemed quite valid choices at the time, especially since these were the API's the OS developers pushed at the time of CMBO's creation. Of course DirectX is still pushed by Microsoft and it is quite capable compared to OpenGL in terms of features, but it obviously can't match OpenGL for cross-platform support.

Anyway, CMBO was created with the then dominant graphical APIs for each platform (which enjoyed more support at the time). The option for software-rendering was present in the first release since it was expected that a lot of CMBO's buyers may not have higher-end hardware to run the game on. Later versions had to do away with this feature for an interface and minimum resolution increase that was seen as necessary in the evolution of the game.

Releasing subsequent versions based on the same graphical engine was necessary in order to get out a product within a tolerable amount of time (for both the developer and the customer). Obviously this has caused problems with the Mac and the lack of 3D RAVE support within OS X. On the PC, which dominates CM sales by a very large margin, the old graphics engine was hardly an issue. The advanced lighting effects of newer DirectX versions probably could have been added, but at a significant cost in additional development time for only so much eye-candy (since there would likely be little game-engine effect until a rewrite of the engine). Admittedly it is an obnoxious choice to make - sacrifice the compatibility of a certain chunk of your playing public to remain in business or risk the entire business (and hope you have enough savings in the bank) and go for a much longer development cycle for your next product. For Steve and Charles the choice seemed clear, despite harping from customers for something much more advanced. If they had the convenience of not having to make a living off of what they do, then a lot could have been different (or quite possibly - never fully existed beyond concept papers, etc.).

I'm not really privy to any particular knowledge about the choices made by BFC and BTS. But I do hate it when they're belittled as short-sighted and ignorant in these forums. They've had to make business decisions that not all customers are pleased with, but they've been necessary in order to keep producing something to keep them in business - and eventually producing the product that everyone harps for (or at least most of the requests).

Oh, and as far as I'm (ignorantly) aware, the alpha-blending capability in CM is based on DirectX 6.x rather than 5.

[ September 14, 2004, 11:56 AM: Message edited by: Schrullenhaft ]


I don't think of BFC as bad; they made a choice and stuck with it, which few people these days do. I tend to think of it as a "what if" type of situation. As far as I am aware, OpenGL was new to Macs in the late 1990s, so that would have been a problem. I know our G3 from 1998 had OpenGL as part of the install, but who can say how good it was compared to today's OpenGL.


  • 6 months later...

This weekend I purchased a BFG FX5500 OC 256DDR.

I see fog for the first time.

I had a gift card for Best Buy and it seemed to be the best Nvidia for 4x, low requirements (250W minimum).

I have a 300W PS with 22A on the 3.3V

My K7SEM is version 1.01 and AGP 2.0 compliant.

There was an uninstall ATI listed in Add/Remove programs along with other ATI items. I navigated to my ATI folder and found the uninstaller. I used it instead of the Windows manner. That seemed to remove all traces of ATI.

The driver CD installed and set the default at 60 Hz. That caused a buzz in an AM radio nearby.

I started up CMAK and it asked me for a preference at 85 Hz. (Remember the previous thread about my new monitor.) After making sure the game worked OK, I changed my display refresh rate to 85 Hz, the highest listed in Windows display properties. The radio buzz went away.

Now to see how CWBR runs.

Thanks for all the help guys.
