Geforce 8800 GTS 512 rendering problem


I am hereby bringing up an old topic of mine in the hope that someone has a fix or that Schrullenhaft has some spare time ;).

Basically, upgrading from the GeForce 6xxx or 7xxx series of cards to the 8xxx series (a Gigabyte 8800 GTS 640 OC) was not a good choice with CMAK or CMBB in mind. The rendering of the graphics is ugly compared to the older cards. What I want to know is whether this is a fait accompli or whether there is a fix for it. I have been fiddling around a great deal with all the different Nvidia Control Panel settings.

I was in contact with Nvidia support, but not too surprisingly the help consisted of "do you use the latest driver?", to which I can respond: I have tried EVERY driver in the universe, and am currently sticking with the newest one. They said they were going to escalate it to second-level support, but I have my doubts.

My own suspicion has led me to mull over these facts, quoted from a tech review of an 8800 card:

Texture filtering now is nearly angle independent with the 8800 series, and steps away from the “bilinear” filtering between the mipmap levels that was less performance intensive and was used for earlier generations of products from both ATI and NVIDIA (...)

Basically it cleans up the image in 3D applications to a great degree. In games such as World of Warcraft, with large areas of repeating textures, the high quality filtering smooths everything out without blurring the textures. Scenes using HQ filtering with 16X anisotropy are about as pristine as one can get.

and

It also introduces a new concept with CSAA. Coverage Sample AA is something of a confusing concept, but one that when explained makes a lot of sense. Basically standard anti-aliasing takes color, geometry, and texture samples for points within a single pixel’s area, and then averages those values into a single pixel color. When CSAA is enabled, it basically increases the coverage sampling values without increasing the color, texture, and geometry samples. This allows more work to be done on a pixel’s final output without increasing the sampling rate (and decreases memory bandwidth utilization). It essentially is more of a weighted average of a pixel’s color based on more values taken from the samples used in 4X MSAA.
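
As far as I can make out, the difference between the two resolves could be sketched like this in Python. The numbers and function names are made up for illustration; this is just my mental model of the description above, not how the hardware actually works:

```python
def msaa_resolve(colors):
    """Plain MSAA: every color sample carries equal weight in the final pixel."""
    return sum(colors) / len(colors)

def csaa_resolve(colors, coverage_counts):
    """CSAA-style resolve: the same few color samples, but each weighted by
    how many of the extra coverage samples its fragment covers."""
    total = sum(coverage_counts)
    return sum(c * n for c, n in zip(colors, coverage_counts)) / total

# Two fragments meet inside one pixel: one covers 12 of 16 coverage samples,
# the other 4 (hypothetical numbers, single-channel "colors" for brevity).
print(msaa_resolve([1.0, 1.0, 1.0, 0.0]))  # 0.75 from four equal color samples
print(csaa_resolve([1.0, 0.0], [12, 4]))   # 0.75, but from a weighted average
```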

Not really being someone who wholly understands this, I can nevertheless deduce that something has happened with the rendering.
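
The filtering part of the first excerpt, as I understand it, boils down to properly blending samples from two texture detail (mipmap) levels instead of taking a cheaper shortcut between them. A toy Python sketch of that blend, purely illustrative:

```python
import math

def bilinear(mip, u, v):
    """Bilinearly sample one mip level (a 2D list of floats), u and v in [0, 1]."""
    h, w = len(mip), len(mip[0])
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = mip[y0][x0] * (1 - fx) + mip[y0][x1] * fx
    bot = mip[y1][x0] * (1 - fx) + mip[y1][x1] * fx
    return top * (1 - fy) + bot * fy

def trilinear(mips, u, v, lod):
    """Blend bilinear samples from the two mip levels nearest to lod."""
    lo = min(int(math.floor(lod)), len(mips) - 1)
    hi = min(lo + 1, len(mips) - 1)
    frac = lod - math.floor(lod)
    return bilinear(mips[lo], u, v) * (1 - frac) + bilinear(mips[hi], u, v) * frac

# A 2x2 mip and its 1x1 average; lod 0.5 blends halfway between the levels.
mips = [[[0.0, 1.0], [1.0, 0.0]], [[0.5]]]
print(trilinear(mips, 0.5, 0.5, 0.5))  # 0.5
```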

Even though I suspect that the crowd who purchase and play the CMX1 games is dwindling, as long as Battlefront.com supports and sells these games this information should be documented for general knowledge. I am sure others besides me would be disappointed if CMAK had the rough look I am experiencing with 16X AA and 16X anisotropic filtering enabled.

Thanks in advance

As I suspected, I got an answer from Nvidia tech.

Hi Gustav,

The case was escalated to me; apologies for the delay in getting back to you. I had to discuss this with some of our engineers. As I suspected, there is a known AA compatibility issue between Battlefront games and the newer AA implementation. In fact, even the older cards and drivers had issues where text went missing, rendering was corrupted, and performance took a hit when AA was enabled. We were able to work around most of those issues, but the new generations and the new method of AA will not work. The older GPUs use a completely different AA implementation from the newer GPUs, and this newer implementation may have problems with older games.

We disable AA for many games that are known to have compatibility issues with our AA implementation and recommend that users enable the in-game AA instead. But most older games did not include in-game AA options, so in that case I'm afraid there is no way to use AA. Most newer games offer their own in-game AA, and we give the user the option to use NVIDIA's AA or the in-game AA. But if there is a known compatibility issue between our AA and a game, then we disable AA in our driver whenever we detect that game. In other words, you couldn't enable NVIDIA AA even if you wanted to, since it's disabled in the driver. Even if you could enable AA, it would not work right and would likely result in corruption. We prevent AA from working to avoid the bad user experience. I'm afraid these older games are just not compatible with our current AA implementation.

Good to finally know!

Sorry I didn't reply to this thread earlier. I was looking into the texture filtering and anisotropic filtering details hoping to find an answer. I didn't really expect it to be related to AA, but rather to filtering (since AA basically affects 'edges', while filtering affects the whole texture and/or the transitions between mip-map levels). I was also surprised that you got a direct answer from Nvidia.

I had heard of changes between the previous 7xxx/6xxx generation of video cards and the newer 8xxx/9xxx generation. I knew that there were differences in the way that they worked, but I wasn't sure how big those differences were and how the drivers treated them.

I'm not sure which games have built-in AA routines. Such an approach takes a bit more development time (which is definitely out of our reach, even with new games) and it usually has a significant detrimental impact on graphics performance. To my knowledge it essentially involves a software method of super-sampling the 3D/texture video data, which multiplies the amount of data sent to the video card. There would be none of the optimizations that the hardware and drivers make for their own routines, and thus it would hit video performance significantly. Typically most developers just provide controls for AA and AF (which use the driver's and hardware's capabilities). I guess larger game developers may code up their own software routines for AA and possibly AF, but that sounds like a lot of additional work that would only pay off on those occasions when there are compatibility or quality issues with the current drivers.
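
Roughly, that software super-sampling would mean rendering the scene at some multiple of the screen resolution and then averaging blocks of pixels down to the final image. A minimal sketch of just the downsample step, assuming a plain box filter and a single-channel image (illustrative only, not anyone's actual code):

```python
def downsample(image, n):
    """Average each n x n block of a super-sampled frame (a 2D list of
    floats) down to one output pixel: a plain box-filter resolve."""
    out_h, out_w = len(image) // n, len(image[0]) // n
    return [[sum(image[y * n + dy][x * n + dx]
                 for dy in range(n) for dx in range(n)) / (n * n)
             for x in range(out_w)]
            for y in range(out_h)]

# A 4x4 render of a diagonal edge averaged down to 2x2: the edge pixels
# come out as intermediate values, which is the anti-aliasing effect.
frame = [[0.0, 0.0, 0.0, 1.0],
         [0.0, 0.0, 1.0, 1.0],
         [0.0, 1.0, 1.0, 1.0],
         [1.0, 1.0, 1.0, 1.0]]
print(downsample(frame, 2))  # [[0.0, 0.75], [0.75, 1.0]]
```

Rendering four times the pixels just to feed that average is exactly why it hits performance so hard.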

I didn't know that Nvidia would "hard code" disabling features for certain games in their drivers. In the Nvidia Control Panel's 3D settings management you can specify the 3D settings for each program, and Nvidia ships presets for a large number of programs and games. I haven't seen BFC's games in this preset list in the past, though. I wonder if those 'presets' are what this support engineer was talking about, or if they actually hard-coded some profiles that can't be changed? If they did hard-code the changes, then you could potentially get around them by renaming the executable (if the 'identification system' were that simple); see the sketch below.
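
If anyone wants to try that, the test is as simple as copying the executable under a neutral name, e.g. with this snippet (the file names are placeholders; use whatever the game's executable is actually called):

```python
import shutil

# Copy the game's executable under a neutral name, so a driver that keys
# its per-game profile off the file name would no longer match it.
# "CMAK.exe" is a placeholder; substitute the real executable's name.
shutil.copy("CMAK.exe", "renamed_test.exe")
```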

Hehe, I renamed the executable, but no, it didn't work. A good idea, though. I wonder if one could install a GeForce driver for the earlier series? Or could that be hazardous for the GPU?

Anyway, thanks for your effort!
