
Is this causing the problems with the 8xxx series?



Desertfox:

Unfortunately, no, I cannot. I do have an older computer with a GF6800GT, but unfortunately it's an AGP card and my current mobo doesn't have an AGP slot. Also, that machine is running a Linux system which I cannot shut down except for very short periods.

One thing I'm seeing here is that my mobo is also using the Intel P965 chipset.



Originally posted by Hotti:

Desertfox:

One thing I'm seeing here is that my mobo is also using the Intel P965 chipset.

Yeah, but that seems unlikely to me to be the problem, since other users (Rune, for example) also have issues with an 8800GTX while using the NVIDIA 680i SLI chipset (ASUS Striker Extreme).

Too bad that you can't perform that non-8800 test.


I did a series of tests using the instrumented drivers, which let me attach performance graphs to the Microsoft Management Console.

This is how I was set up (I changed these values specifically for this test).

From the Nvidia control panel:

Anisotropic filtering: Application-controlled

Antialiasing - Gamma correction: On

Antialiasing - Mode: Application-controlled

Antialiasing - Transparency: Off

Conformant texture clamp: Off

Error reporting: Off

Extension limit: Off

Force mipmaps: None

Multi-display / mixed-GPU acceleration: Single display performance mode

Texture filtering - Anisotropic sample optimization: Off

Texture filtering - Negative LOD bias: Allow

Texture filtering - Quality: High performance

Texture filtering - Trilinear optimization: Off

Threaded optimization: Off

Triple buffering: Off

Vertical sync: Force off

I ran the game with:

The JPG name indicates the model and texture detail level

Resolution: 1280x1024

Vsync: off

AA / MS: off

High priority process: On

Battle: Allahs Fist

You can find the results I got here:

http://www.saunalahti.fi/~hotti/cmsf/

The graphs are explained underneath them, but the thick green line is the one measuring FPS. I would say this is a more reliable way of measuring FPS, since you don't have Fraps slowing down the computer.

First of all, these graphs clearly show that each step down in graphics detail makes things a bit better. I think the biggest differences can be seen between Fastest-Faster and Balanced-Improved. As you can see, with the Fastest graphics detail there are more and more spikes of over 100 FPS.

What is sort of odd to me is the GPU_idle %, which seems to be at 26% on Best settings, drops down to 16% at Better, and then rises steadily from there.

I also noticed that if I scroll high up into the sky so the land disappears beneath me, it doesn't affect FPS at all. If I turn the camera so that I can only see the sun, I get a huge boost in FPS. Looking down towards the land so I can't see the landscape in the distance also gives me a huge boost in FPS (those are the spikes in many of the graphics settings I tested). Would it be possible to use the same low-detail "horizon image" from the Fastest setting in all the other graphics detail settings as well? If that improves the game, is there something wrong with how the textures are handled?


Bah... stupid me. Of course I can set the game up myself like that. So I did another little test, which gave me a small surprise.

I ran the game again with all the same settings as in the previous test, except this time with these in-game settings:

Model Quality: Best

Texture Quality: Fastest

and

Model Quality: Fastest

Texture Quality: Best

I was not expecting high frame rates when running the game with Best model quality, but what surprised me was how low the FPS was when I had the lowest model quality but the best texture quality. I added these results under the same link as in the previous post.


Hotti,

If you get a sudden drop in FPS when you increase texture quality, it would imply that your card doesn't have enough memory to store all the textures and is having to use system RAM, which the card accesses much more slowly. Do you think this might be the case? Maybe the nVidia tool you are using will tell you.
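To put rough numbers on that, here is a quick back-of-the-envelope sketch in C (the texture count and sizes are made-up example figures, not anything taken from CMSF):

#include <stdio.h>

/* Rough VRAM estimate for mip-mapped RGBA8 textures.
 * The count and resolution below are hypothetical example numbers,
 * NOT figures from CM:SF. */
static double texture_mbytes(int width, int height, int bytes_per_texel)
{
    /* A full mip chain adds roughly one third on top of the base level. */
    double base = (double)width * (double)height * bytes_per_texel;
    return base * 4.0 / 3.0 / (1024.0 * 1024.0);
}

int main(void)
{
    int count = 400;  /* hypothetical number of 1024x1024 textures in use */
    double per_tex = texture_mbytes(1024, 1024, 4);
    double total = count * per_tex;

    printf("One 1024x1024 RGBA8 texture with mipmaps: %.1f MB\n", per_tex);
    printf("%d such textures: %.0f MB\n", count, total);
    printf("Fits in 640 MB of video memory? %s\n", total <= 640.0 ? "yes" : "no");
    return 0;
}

Once the working set grows past what the card can hold, frames can stall on transfers over the bus, which would show up as exactly this kind of FPS drop.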

Do please keep plodding on with your tests. There could be something really simple that BFC have missed - like drawing stuff that is outside the viewport for example. Charles may be a genius but he is just one person and everyone makes mistakes. That's why bigger development teams probably catch more problems before release.


My GF8800GTS has 640 MB of RAM, so I REALLY don't think that's the problem :P But yes, the instrumented driver would also let me make a graph of GPU memory usage. I was planning on running some extensive tests over the weekend.

And the more I poke around and do research (even though I'm really no expert in programming, I do know the very basics of some things), the more I get the feeling that this is something that has been overlooked in some new feature of the GF8xxx series cards. But I also get more and more puzzled, since the drivers are built to support the newest hardware first, and older cards are the ones that should be running into problems. Another thing that puzzles me is that I can't find any other programs that have similar problems.

I have every confidence in Charles' programming ability too, but looking for some very small error (it can't be a big one, since the game is more or less working) deep in the core of the engine can be next to impossible, especially if you have no way to test whether a fix will eventually work.


Just thought I'd search Google for information about OpenGL's software rendering mode, to see whether my card was falling back to it. In the process I read this little section on Wikipedia:

"In general, Direct3D is designed to be a 3D hardware interface. The feature set of Direct3D is derived from the feature set of what hardware provides. OpenGL, on the other hand, is designed to be a 3D rendering system that may be hardware accelerated. These two APIs are fundamentally designed under two separate modes of thought. The fact that the two APIs have become so similar in functionality shows how well hardware is converging into user functionality.

Even so, there are functional differences in how the two APIs work. Direct3D expects the application to manage hardware resources; OpenGL makes the implementation do it. This tradeoff for OpenGL decreases difficulty in developing for the API, while at the same time increasing the complexity of creating an implementation (or driver) that performs well. With Direct3D, the developer must manage hardware resources independently - however, the implementation is simpler, and developers have the flexibility to allocate resources in the most efficient way possible for their application.

Until recently, another functional difference between the APIs was the way they handled rendering to textures: the Direct3D method (SetRenderTarget()) is convenient, while previous versions of OpenGL required the manipulation of P-buffers (pixel buffers). This was cumbersome and risky: if the programmer's codepath was different from that anticipated by the driver manufacturer, the code would have fallen back to software rendering, causing a substantial performance drop. According to a Gamasutra article (registration required), the aforementioned John Carmack considered switching from OpenGL to Direct3D because of the contrived use of P-buffers. However, widespread support for the "frame buffer objects" extension, which provides an OpenGL equivalent of the Direct3D method, has successfully addressed this shortcoming."

So, is Charles using the "frame buffer objects" extension?
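For reference, render-to-texture with the "frame buffer objects" extension looks roughly like the sketch below. This is generic EXT_framebuffer_object code (assuming a current GL context and something like GLEW for the extension entry points), not a description of what the CMSF engine actually does; whether it uses FBOs at all is exactly the open question.

#include <GL/glew.h>

/* Minimal render-to-texture setup via GL_EXT_framebuffer_object.
 * Generic sketch -- not CMSF code. */
GLuint make_render_target(int w, int h, GLuint *tex_out)
{
    GLuint tex, fbo;

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

    glGenFramebuffersEXT(1, &fbo);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                              GL_TEXTURE_2D, tex, 0);

    /* If this is not FRAMEBUFFER_COMPLETE the driver may quietly fall back
     * to a slow path -- the kind of silent failure discussed above. */
    if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT) !=
        GL_FRAMEBUFFER_COMPLETE_EXT)
        return 0;

    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);  /* back to the window */
    *tex_out = tex;
    return fbo;
}

The old P-buffer route needed a whole separate pixel-format and context dance, which is why the article calls it cumbersome.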


Originally posted by Hotti:

I was not expecting high frame rates when running the game with Best model quality, but what surprised me was how low the FPS was when I had the lowest model quality but the best texture quality. I added these results under the same link as in the previous post.

Guys, just so you know, the 'Better' and 'Best' texture settings in 1.03 draw 2 textures per terrain type as opposed to 1. This is why these settings show a noticeable slowdown compared to lower settings. Setting your textures to 'Improved' in 1.03 is the same as setting them to 'Best' in 1.02.

Just wanted to clarify that so you know what is happening there.
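For anyone wondering what drawing "2 textures per terrain type" means in practice: in fixed-function OpenGL a second texture is applied through an extra texture unit, roughly like the sketch below. This is a generic multitexturing illustration (assuming GL 1.3+ entry points via something like GLEW), not the engine's actual code.

#include <GL/glew.h>

/* Bind a base terrain texture plus a second detail texture.
 * Each extra unit used per polygon means extra texture fetches
 * and blending work for the GPU. Generic sketch, not CMSF code. */
void bind_terrain_textures(GLuint base_tex, GLuint detail_tex)
{
    /* Unit 0: the base terrain texture. */
    glActiveTexture(GL_TEXTURE0);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, base_tex);

    /* Unit 1: a second texture modulated on top of the first. */
    glActiveTexture(GL_TEXTURE1);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, detail_tex);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
}

With only one texture per terrain type the second unit would simply stay disabled, which fits the speed difference described above.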


I just did a pile of different performance graphs with CMSF and the instrumented driver. I ran the game with "Better" settings for both model and texture quality, so the GPU should have had a whole lotta stuff happening.

The screenshots can again be found in the same place:

http://www.saunalahti.fi/~hotti/cmsf/

Just for reference, I took a screenshot from the PerfSDK manual to show you where in the GPU pipeline these counters are located. The file is gpu_pipeline.jpg, and as you can see, the further down the pipeline you go, the less activity is being reported.

The screens with *_GPU_* in the name are raw data straight from the GPU, showing how much activity is going on. The ones with *_OGL_* are info from the OpenGL driver.

What I find sort of odd is that even though nothing much seems to be happening in the hardware, the GPU idle % is close to 0%.

Edit:

I made one more graph, or actually it's a collection of every counter that is measured in %, to see the big picture of what is going on in the GPU. The file for that is GPU_performance_full_details.jpg. NOTE: the MAX value in these graphs is 10%! So it looks like the GPU isn't even breaking a sweat doing all the stuff it's doing.



I think I might have found something (again, hehe). I found a program called glIntercept, which basically intercepts all OpenGL commands issued by a program and writes them to a highly configurable output logfile. Well, it took ages (something like 1 h 30 min) to process just ONE frame rendered by CM:SF, and the output was a HUGE 3.5 GB of stuff. I told the program to produce an XML file of the output, which turned out to be 133 MB in size; the rest was ALL the images etc. applied in just that one frame. The result was nothing very useful at first glance. (It was also very painful to try to analyze anything, especially since just about anything other than scrolling the page would make Firefox crash.) Then I decided to head back to the OpenGL.org forums to browse what I could find, and ended up at the OpenGL wiki. I'm going to post some interesting stuff I found first, and then something possibly more important.

If you want to find out if a feature is supported in hardware, there's no easy way right now - it'll silently go into software mode.

Once GL3 comes out, anything that isn't hardware-accelerated just won't be created - the object returned will be NULL, or some other indicator of failure. Until then, you just have to hope you don't hit a software path.

Source: http://www.opengl.org/discussion_boards/ubb/ultimatebb.php?ubb=get_topic;f=3;t=015382

So with the current version of OpenGL it's really hard when you hit a situation like the one we have now, so the blame is not entirely on bad NVIDIA drivers. OpenGL itself seems weak in this respect.
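There is no bulletproof query, as that quote says, but you can at least look at the renderer and extension strings when the context is created; if the renderer reports itself as "GDI Generic" you are definitely on Microsoft's software path. A minimal sketch of that kind of check (generic code, nothing to do with CMSF itself):

#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

/* Requires a current OpenGL context. Prints the renderer/version strings
 * and flags the obvious all-software case. */
void report_gl_path(void)
{
    const char *renderer   = (const char *)glGetString(GL_RENDERER);
    const char *version    = (const char *)glGetString(GL_VERSION);
    const char *extensions = (const char *)glGetString(GL_EXTENSIONS);

    printf("Renderer: %s\nVersion: %s\n",
           renderer ? renderer : "(null)", version ? version : "(null)");

    /* Microsoft's software implementation identifies itself as "GDI Generic". */
    if (renderer && strstr(renderer, "GDI Generic"))
        printf("Warning: running on the software renderer.\n");

    /* An advertised extension still does not prove it is hardware
     * accelerated -- which is exactly the problem described above. */
    if (extensions && strstr(extensions, "GL_EXT_framebuffer_object"))
        printf("GL_EXT_framebuffer_object is advertised.\n");
}

That only catches the all-software case, though; a per-feature fallback inside an otherwise hardware-accelerated driver stays invisible, which is the real complaint.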

When I got to the OpenGL wiki, things got a bit more interesting here:

http://www.opengl.org/wiki/index.php/Hardware_specifics:_NVidia

Following up that link about multicore I found this:

Guys, please take a look at this. For optimal performance try not to do many queries into the OpenGL driver for each frame. For example, glGetError() or glGetFloatv() etc.
And then it points here:

http://developer.nvidia.com/object/multi-thread-gdc-2006.html

And now back to the part about using glIntercept. I noticed that one of the last commands in that ONE frame was glGetError(). So it looks like CM:SF is doing EXACTLY what you are told NOT to do: "For example, glGetError() or glGetFloatv() etc." NVIDIA suggests not having these commands in release builds when targeting multi-core systems. Or have you already fixed this for 1.04?

Edit:

Actually, taking another look at the XML (after waiting 5 minutes for Firefox to load it, having moved the folders that hold all the images so they wouldn't load), I found that the whole set is absolutely filled with various glGet... commands at the beginning and the end. I counted about 50 before giving up.
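For what it's worth, the usual way to follow that NVIDIA advice is to compile the per-call error checks out of release builds, something like the macro below. This is just a generic sketch of the technique (using MSVC's _DEBUG define as an example), not a claim about how CMSF is actually built:

#include <stdio.h>
#include <GL/gl.h>

/* Only pay for glGetError() round-trips into the driver in debug builds;
 * in release builds the macro compiles away to nothing. */
#ifdef _DEBUG
#define GL_CHECK(where)                                                  \
    do {                                                                 \
        GLenum gl_err = glGetError();                                    \
        if (gl_err != GL_NO_ERROR)                                       \
            fprintf(stderr, "GL error 0x%04X at %s\n", gl_err, (where)); \
    } while (0)
#else
#define GL_CHECK(where) ((void)0)
#endif

/* Usage after a draw call, e.g.:
 *     glDrawArrays(GL_TRIANGLES, 0, vertex_count);
 *     GL_CHECK("terrain pass");
 */

That way the error reporting is there while developing, but the shipped build never stalls the driver queue with glGet* calls.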



peleprodigy,

Wow man, you are en fuego with the bug tracking skills, though I am waiting for the obligatory... 'this tells us nothing' post. Don't be discouraged, keep it up.
An attitude like this is counterproductive. If Hotti figures out what is going on, I can tell you NOBODY would be happier to know about it than us. We are wasting a lot of time on this issue and would like nothing more than a shortcut to a happy conclusion for all. Bug tracking of this sort involves a lot of dead ends that at first looked promising. Hotti's initial discovery turned out to be a dead end, so we said as much, because what else are we supposed to say?

Hotti, I have no way of telling if any of this hard work you're putting in will help at all. All I can say is we are extremely pleased that you are curious and skilled enough to put in time trying to find the problem. It is possible that none of your efforts will result in a fix any sooner, but it is also possible that one of these times you'll post something that will at least point us in the right direction (if not to the right spot!!).

Your latest posts are very interesting to me and I've made sure Charles has seen the meat of them. I'm sure he is already aware that the feedback from errors is sorely lacking. In fact, if you guys knew how bad the tools for troubleshooting and controlling card settings are, you'd probably be amazed that anything can run at all. I know I am :D The second part of your post, which lists advice about specific calls, might help or it might not. For all I know CM doesn't use those calls, or doesn't use them very much. But it is at least something to double-check, just like we double-checked your initial guess. The fact that it didn't find the problem is not important to us. Trying is all that matters, because the more information we have, the better our chances of fixing this sooner rather than later.

Plus, if anybody other than Charles can find the cause of this problem it will probably be a Finn. Since Hotti is a Finn, that's a step in the right direction right there :D

Thanks!

Steve



Überfinns ftw!

Aaaanyhow... I hope my poking around doesn't make it seem like I don't have confidence in your ability to find the issue. Quite the contrary, I have every faith that you, if anyone, are able to fix this. My intention is to gather as much information as I can by running the game with whatever loggers etc. I can find, and then post that information along with anything else I might stumble on while browsing the net. But since we seem to be stumbling blind in the dark here, trying to kill this bug with a shotgun might be more efficient than trying to hit it with a sniper rifle (without night vision). Sure, it will most probably take a few shots and I might even shoot myself in the leg, but the odds of hitting the bug are still better :D

I found yet another OpenGL debugger, which seems to be the most promising of all the ones I've used so far. It's called gDEBugger, made by Graphic Remedy, but unfortunately there is some error when the program runs with CMSF, so I cannot get it to work right now. Their FAQ says to email them the error logs, so I did, and now I'm hoping for a reply from them so I can get to the actual debugging. The program is not free; there is a 30-day trial, and a full license would cost somewhere near $600. And it wouldn't do me any good anyway, since it doesn't work.


Hi Hotti,

Aaaanyhow... I hope my poking around doesn't make it seem like I don't have confidence in your ability to find the issue.
Not at all! If this were an easy thing to fix we would have had it locked up weeks ago, but it isn't. Therefore, all the help we can get we'll take. As you say, the more people trying to come up with a reason for this issue the greater the chance we'll find it. If you saw how many ideas Dan, Martin, Rune, and I have sent to Charles that turned out to be not at all helpful (on their own, at least) you'd know that I can sympathize with you in having your first guess shot down. Several times now we've thought "AH!! That's it!!!" only to find out it isn't. Frustrating to say the least!

Keep at it!

Steve


Since it seems I cannot run gDEBugger with CMSF, could anyone else try and see if you can get it running with CMSF?

Just download the trial version from http://www.gremedy.com/ and install the program. Then go to the directory where you installed it, for example c:\program files\graphic remedy\gDEBugger; in the folder named spies there is an opengl32.dll. Copy this to the CMSF installation folder, run gDEBugger and make a new workspace for CMSF. Then start the debugger, see if you can get the game to start, run a scenario where you get poor performance, and see what kind of data the debugger provides. I'm really not sure what to expect at all, but any info you can get from the program might be helpful. Oh, and be sure to check where the program saves its log files (you might want to change the directory for easier access).

If you can't get it working, just remove the opengl32.dll from the CMSF directory and the debugger won't start when you run CMSF.


Originally posted by Hotti:

Since it seems I cannot run gDEBugger with CMSF, could anyone else try and see if you can get it running with CMSF?


Did the above, and the game would not run. Removed the opengl32.dll file, game would run.

One thing I did notice is that the version number of the opengl32.dll in my Windows directory is different from the one that comes with the debugger.

WinXP SP2 opengl32.dll version: 5.1.2600.2180

Debugger opengl32.dll version: 3.1.1.5110

Have no idea if that means anything.


The debugger's opengl32.dll is meant to intercept OpenGL communication, and 3.1.1.5110 is the version of gDEBugger itself: version 3.1.1, build 5110. OK, so it looks like it's a more general error in gDEBugger rather than just me. I sent them the sample as requested in the FAQ on their website. Let's hope there's a quick solution for this, so we can see whether this program gives us any more info than the earlier ones I've tried.


Hi again. I did some tests with a program called nvHardPage, which I found on guru3d.com. It gives deeper control over the NVIDIA driver than the NVIDIA control panel does. For example, I could force the OpenGL driver to work in an emulation mode for any existing NVIDIA hardware architecture, and guess what: even though I set the driver to emulate only OpenGL 1.2 (the game wouldn't work on 1.0 or 1.1, which tells me the emulation actually works), I got the same very bad FPS as ever (these settings would have the card behave like an old FX 5800 series card). Also, cranking the settings up to the very max slowed the already bad FPS down only a little.


Now, if any of you nVidia folks have multiple monitors, that could be part of the problem. This "fix" was found that may help you:

Anyone using NVIDIA + multiple screens and having very slow performance: you need to change a setting in the NVIDIA control panel:

3D Settings => Manage 3D Settings => Multi-display / mixed-GPU acceleration = Single display performance mode.

This will fix performance for all OpenGL-based games if you have multiple screens.

If this helps out anybody, please let us know.

Steve

