
Nvidia users, immediate testing required



TheDesertFox,

I really hope BF.C will manage to convince Nvidia that CMSF's obvious inability to use state-of-the-art hardware is due to Forceware driver issues.
This is what one OpenGL developer said about the issue last month:

"Well, to be honest, it makes OpenGL (or at least its NVidia implementation, don't know how ATI compares) absolutely unusable in practise until it's fixed if you're trying to release a game with any modern/next-gen effects."

There are no hard data which prove that the Forceware drivers are the problem with CMSF, right? Go ahead and find some!
That's what we're trying to do now, but even then... if they don't want to fix it, they won't. We handed ATI two OpenGL bugs in their drivers on a silver platter and a third one on a bronze plate. We got the good old corporate runaround and ball dropping from their developer program, and surprise surprise, no fix from them yet.

However, from a customer's point of view all this fuss might tell you a lot, and you can certainly learn some lessons from it: never, ever buy a pig in a poke when it comes to an advertised software product.
I don't disagree with this. The unexpected hardware difficulties with the 8800 were not a nice surprise for anybody.

peleprodigy,

I guess I hadn't heard that the 8800 was a defective product until I came across Combat Mission. Go figure.
You got lucky with the other games, unlucky with CM. Other 8800 users got lucky with all of them. Go figure.

Steve

Link to comment
Share on other sites

Yet another possible cause for problems with the 8800 series...

20 amp power supplies not sufficient. Need 25 amps

A quote from a CNET review of the card itself:

To power a single GeForce 8800 GTX card, Nvidia recommends a 450-watt power supply in a PC with a high-end dual-core chip and a typical combination of internal hardware. But the trick is that the power supply must have two PCI Express card power connectors to plug into the two sockets on the back of the card. Most modern power supplies should have the necessary connectors. If you want to add two 8800 GTX cards in an SLI configuration, however, you've got a challenge on your hands.

Nvidia hasn't released a driver that will run the GeForce 8800 GTX in SLI mode as of the time of this writing, but it may have one out soon. Thus, we didn't get to test it, but Nvidia did share the power supply specs with us. To run two GeForce 8800 GTX cards in SLI mode, Nvidia recommends at least a 750-watt power supply. But some of the recommended models on its SLI compatibility list go as high as 850 and even 1,000 watts. We suspect those higher-wattage recommendations will allow you some headroom for adding multiple hard drives and optical drives, as well as very high-end quad-core processors. Still, it's clear that building a next-gen SLI rig will be no small undertaking, at least for now. Heck, many midtower PC cases are too small to accept a 1,000-watt power supply.

So if any of you bought a card and stuck it into an existing PC without checking the power specs, I suggest checking things now.

As I said earlier, the problem with "slow framerate" is that there can be many different reasons for this yet no way to tell what the cause is. We've seen plenty of examples of this being true with the 7xxx and 8800s already. So this fix might help nobody, but it might help one or two people. It's impossible for me to say, so I just pass along each find and it either helps or doesn't.

Steve

[ September 13, 2007, 12:05 AM: Message edited by: Battlefront.com ]

Link to comment
Share on other sites

From a customer's standpoint the facts are simple.

Did anyone who is complaining about the game try the demo?

Did the demo run smoothly on their PC?

And if the demo ran smoothly, why didn't the game do the same?

Battlefront.com encourages you to review the product Demo before making a purchase because ALL SALES ARE FINAL

My game v1.02 now runs on an ATI card just the same way as the demo does, but v1.03 is crappy. So I am not complaining at all.

[ September 13, 2007, 12:18 AM: Message edited by: Knaust1 ]

Link to comment
Share on other sites

Knaust,

We have no idea why the v1.03 game runs slower on your computer than v1.02, but we can say that this is not at all related to the issues being discussed here. These people have had problems with all previous versions of CM, from v1.0 through v1.03. If they had tried the v1.01 or v1.02 demo, I'm sure the speed would have been just as bad for them as in the full version.

Version 1.04 might help you out some with your problem, and it might help some of the 8800 people too.

Steve

Link to comment
Share on other sites

Originally posted by Battlefront.com:

Knaust,

We have no idea why the v1.03 game runs slower on your computer than v1.02, but we can say that this is not at all related to the issues being discussed here. These people have had problems with all previous versions of CM, from v1.0 through v1.03. If they had tried the v1.01 or v1.02 demo, I'm sure the speed would have been just as bad for them as in the full version.

Version 1.04 might help you out some with your problem, and it might help some of the 8800 people too.

Steve

OK, that's fully correct!
Link to comment
Share on other sites

Calm down, guys! I had my doubts about whether the problem is actually in the game code, but in the last two days I've become convinced that it really lies in the Nvidia drivers. If you take the time to read (and understand) the release notes of the new driver, it states that they have made some fixes to the memory management of

"GeForce 8 series GPUs running DirectX 9 applications in single-GPU and NVIDIA SLI configurations."

They only mention DirectX 9. Making tweaks to the DirectX driver does not help OpenGL programs, since the two are completely different things!

Another thing is the article in The Inquirer. It states that the 8xxx series cards have trouble with memory management:

The problem lies in the fact that the G80 GPU has some trouble with texture memory management. In many unrelated cases, this bug starts leaving textures in video memory and system memory, causing the board to run out of texture memory. Once you run out of memory for textures on your graphics card and system, you will experience a lot of swapping of textures to the hard drive, and that is a recipe for disaster, at least as far as framerates are concerned.
The reason ALT-TABbing helps for a few seconds is that it FORCES the texture memory to empty. When you switch back into the game the bug starts again and the memory fills up very quickly.
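
For the curious, here's a rough sketch (my own illustration, and it assumes the driver exposes NVIDIA's GL_NVX_gpu_memory_info extension, which not every Forceware release does) of how an app could watch free VRAM shrink while the scene stays the same:

    // vram_watch.cpp -- hypothetical sketch: poll free VRAM via GL_NVX_gpu_memory_info.
    // Assumes an OpenGL context is already current and that the driver exposes the
    // extension; older drivers may not report these values at all.
    #include <GL/gl.h>
    #include <cstdio>
    #include <cstring>

    // Tokens from the GL_NVX_gpu_memory_info extension specification.
    #define GL_GPU_MEMORY_INFO_TOTAL_AVAILABLE_MEMORY_NVX   0x9048
    #define GL_GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX 0x9049
    #define GL_GPU_MEMORY_INFO_EVICTION_COUNT_NVX           0x904A

    void reportVram()
    {
        const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
        if (!ext || !std::strstr(ext, "GL_NVX_gpu_memory_info")) {
            std::printf("Driver does not expose GL_NVX_gpu_memory_info\n");
            return;
        }
        GLint totalKb = 0, freeKb = 0, evictions = 0;
        glGetIntegerv(GL_GPU_MEMORY_INFO_TOTAL_AVAILABLE_MEMORY_NVX, &totalKb);
        glGetIntegerv(GL_GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX, &freeKb);
        glGetIntegerv(GL_GPU_MEMORY_INFO_EVICTION_COUNT_NVX, &evictions);

        // If the free figure keeps shrinking while the scene is unchanged, textures
        // are being duplicated or stranded rather than reused -- exactly the symptom
        // described above. ALT-TAB (or the Windows key) resets it for a while.
        std::printf("VRAM: %d KB free of %d KB, %d evictions\n", freeKb, totalKb, evictions);
    }

Call something like reportVram() once a second from the render loop and log the numbers; a steadily falling free figure with a static scene would be the smoking gun.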

So you think other games should be affected by this too? Let's do some quick math:

CMSF map size: 4 km x 4 km

Tile size (if I understood it right): 8 m x 8 m

That gives a tile count of 500 x 500 on a huge map, which comes to 250,000 tiles, and 250,000 textured tiles for the ground alone. If you use settings better than "Improved" the game uses 2 textures per tile, which brings us to 500,000 textured ground faces in the landscape alone. Then you add trees, rocks, grass, units, etc., and it requires a MASSIVE amount of texture data to be loaded into your graphics card's memory (of course not EVERY tile uses a different texture, but still). And if the memory management is screwed up, as seems to be the case, comparing CMSF to any other OpenGL game is pointless, since I haven't seen any other game that uses this amount of textures at one time.
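
Just to make the back-of-the-envelope numbers explicit (this is only my own rough arithmetic, using the assumed 8 m tiles and 4 km map edge, not confirmed figures from BF.C):

    // tile_count.cpp -- rough worst-case arithmetic for the estimate above.
    // Assumptions: 4000 m map edge, 8 m tiles, 2 ground textures per tile
    // at settings above "Improved".
    #include <cstdio>

    int main()
    {
        const int mapEdgeMeters  = 4000;
        const int tileEdgeMeters = 8;
        const int tilesPerEdge   = mapEdgeMeters / tileEdgeMeters;            // 500
        const long tiles         = static_cast<long>(tilesPerEdge) * tilesPerEdge; // 250,000
        const long groundLayers  = tiles * 2;                                 // 500,000 textured faces

        // Upper bound only: in practice one texture is loaded once and reused by
        // thousands of tiles, so the number of distinct textures is far smaller.
        std::printf("tiles: %ld, textured ground faces at Better/Best: %ld\n",
                    tiles, groundLayers);
        return 0;
    }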

What we need is for Nvidia to update the OpenGL driver too! What we need to do is start filing bug reports with Nvidia about running CM:SF on the GF8xxx series, and make sure you mention that it uses OpenGL.

[ September 13, 2007, 12:42 AM: Message edited by: Hotti ]

Link to comment
Share on other sites

Originally posted by Battlefront.com:

That's what we're trying to do now, but even then... if they don't want to fix it, they won't. We handed ATI two OpenGL bugs in their drivers on a silver platter and a third one on a bronze plate. We got the good old corporate runaround and ball dropping from their developer program, and surprise surprise, no fix from them yet.

Steve

Steve,

you are certainly aware of how important this particular point is for your future products that use the CMSF engine, aren't you? I mean, with more and more customers switching over the next two years or so to the 8800 family and its successors, or to the newer ATIs, you absolutely need to get your product up to speed on this hardware, one way or the other. If you don't, you end up unnecessarily restricting your future customer base to users with aged hardware.

cheers

Helge

Link to comment
Share on other sites

Hotti,

This is still our lead suspect, but we're still not sure. It's very hard to pin this down to just one thing. Also, your texture math is wrong :D A texture is loaded once and can be used any number of times. But your point is still accurate in that we load a LOT of stuff into memory and therefore any memory problems with the card are quickly compounded.
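
To make the "loaded once, used any number of times" point concrete, a generic legacy-OpenGL fragment (just an illustration, not our actual engine code) looks something like this:

    // texture_reuse.cpp -- hypothetical sketch of texture sharing in legacy OpenGL.
    // The texture is uploaded to VRAM exactly once; drawing 250,000 tiles with it
    // adds no further texture memory, only geometry and fill cost.
    #include <GL/gl.h>
    #include <vector>

    GLuint uploadGrassTexture(const unsigned char* rgba, int w, int h)
    {
        GLuint tex = 0;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, rgba);   // one upload, one copy in VRAM
        return tex;
    }

    struct Tile { float x, z; };

    void drawGround(GLuint grassTex, const std::vector<Tile>& tiles)
    {
        glBindTexture(GL_TEXTURE_2D, grassTex);          // bound once, reused by every tile
        glBegin(GL_QUADS);
        for (const Tile& t : tiles) {
            glTexCoord2f(0, 0); glVertex3f(t.x,     0, t.z);
            glTexCoord2f(1, 0); glVertex3f(t.x + 8, 0, t.z);
            glTexCoord2f(1, 1); glVertex3f(t.x + 8, 0, t.z + 8);
            glTexCoord2f(0, 1); glVertex3f(t.x,     0, t.z + 8);
        }
        glEnd();
    }

If the driver's memory manager misbehaves, though, even a correctly shared texture set can end up duplicated or stranded in VRAM, which is the failure mode being discussed here.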

The DesertFox,

Oh, tell me about it :( Why do you think I'm still here at my computer at 04:00 when I have to get up in about 6 hours? As I said, nobody wants to find a fix for the 8800 people more than us, which is why peleprodigy's constant sniping and counterproductive attitude is all the more annoying. We know there is a problem and we're trying to figure out how to fix it.

Steve

Link to comment
Share on other sites

First off, the Alt-Tab thing doesn't work on my PC. It doesn't get me back to the desktop. However, pressing the Windows key does - why is this?

Secondly, I see an FPS increase when I Windows-key out of the game and then go back in, but my card is a GeForce 7600 GT. So, as others have said, let's not keep assuming only 8800 cards are affected - they most definitely aren't.

Thirdly, I have to dispute Hotti's analysis of the number of textures loaded into memory. If you want to max out the texture memory requirements for the game, just build a test map that has every possible type of tile and flavour object on it. It shouldn't matter how many tiles of each type are on the map as the game will reuse the same texture. What might matter is if a particular texture is a bigger file than all the others and is used many times in the map, as the GPU would be moving around a lot more data for that one texture than all the others. Maybe grass textures need to be optimized?

[Edit] Steve beat me to it regarding the texture math! Crossed posts.

Link to comment
Share on other sites

Oh, I just remembered one thing about all those extensive tests I was running a while back. Of all the graphs I made and all the different data I was able to get using the instrumented driver from Nvidia, ONE thing was reported as NOT functioning: the VRAM usage statistics. And if I remember correctly, this "feature" of not being able to monitor VRAM usage was reported on all the cards we are having trouble with now. So even the tools provided by the manufacturer of the core of these cards seem to be affected by the problems with memory management. How can you expect BF.C NOT to be affected by it? I believe that this memory management issue is the single biggest reason most of us are having problems with the 8xxx series cards, and maybe even with the high-end 7xxx series cards too. Unless Nvidia fixes this (which apparently is not certain at all!!) there is nothing we or BF.C can do about it. If you understand even a little about the way APIs work, this simple pipeline shows how the thing goes:

CM: SF communicates with -> OpenGL API

OpenGL API communicates with -> Nvidia (OpenGL)Drivers

Nvidia (OpenGL) Drivers communicate with -> Nvidia Hardware

And as the problem seems to be in the last part of the "pipeline" only Nvidia can fix this.
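
About the only part of that driver layer an application can actually see is what the driver chooses to report about itself; something like this (plain standard OpenGL calls, nothing CMSF-specific) is as deep as any app can reach from its side of the pipeline:

    // driver_id.cpp -- hypothetical sketch: the only driver-level facts visible
    // through the OpenGL API are the strings the driver reports.
    #include <GL/gl.h>
    #include <cstdio>

    void printDriverInfo()   // requires a current OpenGL context
    {
        std::printf("Vendor:   %s\n", reinterpret_cast<const char*>(glGetString(GL_VENDOR)));
        std::printf("Renderer: %s\n", reinterpret_cast<const char*>(glGetString(GL_RENDERER)));
        std::printf("Version:  %s\n", reinterpret_cast<const char*>(glGetString(GL_VERSION)));
        // Everything below this point -- how the driver manages VRAM, schedules the
        // GPU, and talks to the hardware -- is a black box to the application.
    }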

Link to comment
Share on other sites

The grass is definitely a major FPS killer, but when you reduce Graphics Quality the amount of grass "doodads" drawn drops off dramatically. This is one reason why Best and Better settings are so hard on systems, in addition to drawing 2 ground textures instead of 1.

I've said it many times before, though... we're sure we are not looking at one single problem. The GPU/VRAM thing is probably a part of it, but we do not think it explains everything.

The one exception to Hotti's otherwise accurate description of how things are SUPPOSED to work is that if we can identify where the driver/card is failing to execute the API correctly, we can potentially work around the problem. We did this with the ATI left-click bugs. This doesn't mean the workaround is great, but it can often be better than the alternative. For ATI users the alternative is crashing :(

Steve

Link to comment
Share on other sites

Yes, I know the math was overly simplified and not accurate (I tried to make that point by saying that not every tile uses a different texture).

Another interesting idea just occurred to me. Would it be possible to make a HUGE map with as few textures as possible? And also a tiny map with an insane amount of textures? And maybe some variations in between? These would definitely help us determine whether the main reason we are getting poor FPS is the textures.

Darn, I was supposed to be back in school 5 minutes ago, gotta run!

"I would hate you if I wouldn't love you...."

Link to comment
Share on other sites

Originally posted by Hotti:

CM: SF communicates with -> OpenGL API

OpenGL API communicates with -> Nvidia (OpenGL)Drivers

Nvidia (OpenGL) Drivers communicate with -> Nvidia Hardware

A question out of pure curiosity, which might help to better understand this tech stuff....

How do you think other existing OpenGL applications circumvent this central problem?

I mean, CMSF is not the only OpenGL app on this planet that handles large amounts of texture data. Some fail and are affected by this Nvidia memory leak bug, and other apps function perfectly well and are capable of tossing data around perfectly. What are the differences between those that fail and those that succeed? Any ideas?

cheers

Link to comment
Share on other sites

The DesertFox,

I mean, CMSF is not the only OpenGL app on this planet that handles large amounts of texture data. Some fail and are affected by this Nvidia memory leak bug, and other apps function perfectly well and are capable of tossing data around perfectly. What are the differences between those that fail and those that succeed? Any ideas?
It's like any complex system... it all depends on what you're trying to do and how you go about doing it. The ATI left-click problem is a very relevant example. "I have lots of games and none have this problem, only CM". Then we see the list of games and, first off... most are DirectX. So in fact they really only have, say, one other OpenGL game and it's a first person shooter. The clicking that we do requires translating 2D clicks into 3D coordinates. Quake and other games do not do that. Their mouse stuff interacts with a 2D interface and therefore never calls the same things we call. Hence why CM crashed and something like Half-Life didn't.
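
Roughly, and only generically (a simplified gluUnProject sketch, not our actual picking code), that call path looks like this; an FPS whose mouse only ever touches a 2D menu never exercises the depth readback involved:

    // pick_3d.cpp -- hypothetical sketch of turning a 2D mouse click into a 3D
    // world coordinate, the code path a 2D-menu-only game never touches.
    #include <GL/gl.h>
    #include <GL/glu.h>

    bool pickWorldPoint(int mouseX, int mouseY, double out[3])
    {
        GLint viewport[4];
        GLdouble modelview[16], projection[16];
        glGetIntegerv(GL_VIEWPORT, viewport);
        glGetDoublev(GL_MODELVIEW_MATRIX, modelview);
        glGetDoublev(GL_PROJECTION_MATRIX, projection);

        // Window coordinates have Y flipped relative to mouse coordinates.
        const double winX = mouseX;
        const double winY = viewport[3] - mouseY;

        // Read the depth under the cursor -- a rarely-used path where a driver
        // bug can easily hide.
        GLfloat winZ = 0.0f;
        glReadPixels(mouseX, viewport[3] - mouseY, 1, 1,
                     GL_DEPTH_COMPONENT, GL_FLOAT, &winZ);

        return gluUnProject(winX, winY, winZ, modelview, projection, viewport,
                            &out[0], &out[1], &out[2]) == GL_TRUE;
    }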

If the GPU is processing things wrongly into VRAM, it could be because it doesn't like a certain pattern of things handed to it. If Game A uses a different pattern, perhaps the GPU handles it correctly while mishandling Game B. It's like any bug... its outcome can be very unpredictable from an end user's standpoint.

Look at CM, for example... the current bug that has a unit getting stuck sometimes. To the user there is no difference between the times it gets stuck and the times it doesn't. So why don't all units get stuck all the time instead of just a very small number? Well, from the game's standpoint there is a difference, it just isn't one that is obvious.

Remember that CM in general does things that other games don't do, and doesn't do things other games do. It's just the nature of CM, and that puts it into its own league. Also keep in mind that card companies tend to favor the way FPS games work, and they work differently than CM. Therefore, the reason FPS games all tend to work is that they all tend to use the same features the drivers offer in the same ways. Even so, nVidia had to put out a special driver to run Quake correctly. It just shows how "touchy" the drivers are, even for what is arguably the BIGGEST OpenGL game in existence.

Steve

Link to comment
Share on other sites

Oh, and let's not forget... if John Carmack has a problem with an nVidia driver, he gets listened to. If they don't listen, well, the executives would watch their nVidia stock go down the toilet. nVidia probably doesn't even know we exist, and if they did we certainly wouldn't get the same sort of service a big guy would.

Meaning, when other games have problems with drivers, the drivers are fixed either before the game's release or shortly thereafter, so that the gamer doesn't notice there was a problem in the first place. We have no such clout.

Steve

Link to comment
Share on other sites

Steve,

thanks for your answer. Things are getting a little bit clearer now.

Originally posted by Battlefront.com:

Meaning, when other games have problems with drivers, the drivers are fixed either before the game's release or shortly thereafter, so that the gamer doesn't notice there was a problem in the first place. We have no such clout.

Steve

Yep, that's how business works. However, the problem remains, and I guess the question we are all interested in hearing the answer to is:

what do you plan to do for your affected customers if the "worst case" scenario happens and, assuming Nvidia's drivers are the culprit, Nvidia decides to ignore you?

cheers

Link to comment
Share on other sites

Originally posted by The DesertFox:

what do you plan to do for your affected customers if the "worst case" scenario happens and, assuming Nvidia's drivers are the culprit, Nvidia decides to ignore you?

Give Charles a crash course in DirectX?

Seriously though, I think most game developers choose to use DirectX because then they aren't responsible for hardware issues. Microsoft and nVidia have to sort it out.

Link to comment
Share on other sites

In my opinion, a good thing for all of us end users to do is go to the Nvidia website and file a bug report about their drivers. When they get one bug report from Charles, they'll probably shuffle it to the bottom of the to-check list. But if they get 100 bug reports from us end users, they might consider taking a quicker and closer look at the issue. Just don't start spamming their bug report system or they might do the opposite. We must let Nvidia know that there are many of us having this problem and that ignoring it would be bad for business.

Here is a link to a page from where you can report a bug:

http://www.nvidia.com/object/vistaqualityassurance.html

I know it's for the "Vista driver" but hey, the problem is in Vista too! If you're running XP, just make sure you mention that you have the same problem there too.

[ September 13, 2007, 04:48 AM: Message edited by: Hotti ]

Link to comment
Share on other sites

Originally posted by Battlefront.com:

Cameroon,

It would not surprise me to find out that whatever the problem is with G80 is found in the G70. Hmmm... actually, do you have Vista? From what I was reading earlier today Vista really brings out the worst in this problem.

Steve

Nope, not Vista - WinXP SP2. For a while I was using the 94.24 drivers since they gave the best performance (going to try them again tonight if I can), but the 163.67 drivers appear to work about as well for me. In-between versions suffered from massive FPS penalties.
Link to comment
Share on other sites

Originally posted by Cpl Steiner:

Originally posted by The DesertFox:

what do you plan to do for your affected customers if the "worst case" scenario happens and, assuming Nvidia's drivers are the culprit, Nvidia decides to ignore you?

Give Charles a crash course in DirectX?

Seriously though, I think most game developers choose to use DirectX because then they aren't responsible for hardware issues. Microsoft and nVidia have to sort it out.

Link to comment
Share on other sites

Originally posted by Melnibone:

Originally posted by Cpl Steiner:

First off, the Alt-Tab thing doesn't work on my PC. It doesn't get me back to the desktop. However, pressing the Windows key does - why is this?

Ditto - I'm almost sure Alt-Tab worked in 1.01 for me, though. No biggie, as Windows-D takes me to the desktop anyway.
Link to comment
Share on other sites

Originally posted by Hotti:

Here is a link to a page from where you can report a bug:

http://www.nvidia.com/object/vistaqualityassurance.html

I did that and included the fact that my 8800 GTS (640 MB) scores half the FPS (8-9 FPS) that my 7950 GT (512 MB) scores (15-17 FPS) under identical settings, both at 16x12, in Vista32 and Vista64.

Maybe this will help gain some momentum on the matter in question.

cheers

Link to comment
Share on other sites

Originally posted by Battlefront.com:

Yet another possible cause for problems with the 8800 series...

20 amp power supplies not sufficient. Need 25 amps (http://forums.nvidia.com/lofiversion/index.php?t44480.html)

So if any of you bought a card and stuck it into an existing PC without checking the power specs, I suggest checking things now.

I have a brand new SeaSonic PSU that is more than sufficient.
Link to comment
Share on other sites
