Everything posted by Sekra

  1. In my thread about this problem I explained how I experimented with the OpenGL drivers, forcing them to emulate OpenGL version 1.2 (with versions 1.0 and 1.1 CMSF wouldn't run at all). I got no difference in performance, even though this basically makes the card look like an Nvidia GeForce FX 5800 or something like that, so this might be some sort of hardware issue rather than a driver issue. Currently I'm thinking this is related to Nvidia recognizing the rising number of multi-core processors and preparing their newest gfx cards for multi-threading support. [ September 10, 2007, 09:16 AM: Message edited by: Hotti ]
  2. Hi again. I did some tests with a program called nvHardPage that I found on guru3d.com, and it gave me deeper control over the Nvidia driver than the Nvidia control panel does. For example, I could force the OpenGL driver to work in an emulation mode for any existing Nvidia hardware architecture, and guess what: even when I set the driver to emulate only OpenGL 1.2 (the game wouldn't work on 1.0 or 1.1, which tells me the emulation actually works), I got the same very bad FPS as ever. These settings should make the card behave like an old FX 5800 series card or so. Also, cranking the settings up to the very max slowed the already bad FPS down only a little.
  3. The debugger's opengl32.dll is meant to intercept OpenGL communication, and 3.1.1.5110 is the version of gDEBugger: version 3.1.1, build 5110. Okay, it looks like this is a more general error in gDEBugger rather than something on my end only. I sent them the sample as requested in the FAQ on their website. Let's hope there is a quick solution for this, so we can see whether this program gives us any more info than the earlier ones I've tried.
  4. Since it seems like I cannot run gDEBugger on CMSF, could anyone else try to see if you can get it running with CMSF? Just download the trial version from http://www.gremedy.com/ and install the program. Then go to the directory where you installed it, for example c:\program files\graphic remedy\gDEBugger; in the folder named spies there is an opengl32.dll. Copy this to the CMSF installation folder, run gDEBugger and make a new workspace for CMSF. Then start the debugger, see if you can get the game to start, run a scenario where you get poor performance, and see what kind of data the debugger provides. I'm really not sure what to expect at all, but any info you can get from the program might be helpful. Oh, and be sure to check where the program saves its log files (you might want to change the directory for easier access). If you can't get it working, just remove the opengl32.dll from the CMSF directory and the debugger won't start when you run CMSF.
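     For anyone curious how that dropped-in opengl32.dll trick works at all: the spy DLL exports the same function names as the real opengl32.dll, so the game picks it up from its own folder first, and the spy forwards each call to the real driver DLL in System32. Below is a minimal sketch of the idea for a single function; this is not gDEBugger's actual code, and a real interposer has to export every GL function, not just one.
     code:
       // Minimal sketch of a drop-in opengl32.dll that intercepts one call and
       // forwards it to the real OpenGL DLL in the Windows system directory.
       #include <windows.h>
       #include <cstdio>

       typedef unsigned int (WINAPI *PFNGLGETERROR)(void);

       static HMODULE       g_realGL = NULL;
       static PFNGLGETERROR g_realGlGetError = NULL;

       static void LoadRealOpenGL(void)
       {
           if (g_realGL) return;
           char path[MAX_PATH];
           GetSystemDirectoryA(path, MAX_PATH);        // e.g. C:\WINDOWS\system32
           strcat_s(path, MAX_PATH, "\\opengl32.dll");
           g_realGL = LoadLibraryA(path);              // the real driver-facing DLL
           g_realGlGetError = (PFNGLGETERROR)GetProcAddress(g_realGL, "glGetError");
       }

       // Exported under the same name, so the game ends up calling this instead.
       extern "C" __declspec(dllexport) unsigned int WINAPI glGetError(void)
       {
           LoadRealOpenGL();
           unsigned int err = g_realGlGetError();      // forward to the driver
           if (err != 0)
               fprintf(stderr, "intercepted glGetError -> 0x%04X\n", err);
           return err;
       }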
  5. Überfinns ftw! Aaaanyhow... I hope my poking around doesn't seem like I don't have confidence in your ability to find the issue. Quite the contrary, I have all the faith that you, if anyone, are able to fix this. My intention is to gather as much information as I can by running the game with various loggers and other tools, and then post that information along with anything else I might stumble on while browsing the net. But since we seem to be stumbling blind in the dark here, trying to kill that bug with a shotgun might be more efficient than trying to hit it with a sniper rifle (without night vision). Sure, it will most probably take a few shots and I might even shoot myself in the leg, but the odds of hitting the bug are still better. I found yet another OpenGL debugger, which seems to be the most promising of all the ones I've used so far. It's called gDEBugger, made by Graphic Remedy, but unfortunately there is some error in the program when running with CMSF, so I cannot get it to work right now. Their FAQ says to email them the error logs, so I did, and now I'm hoping for a reply from them so I can get to the actual debugging. This program is not free: there is a 30-day trial, and a full license would cost somewhere near $600. And that wouldn't do me any good either, since it doesn't work.
  6. I think I might have found something (again, hehehe). I found a program called glIntercept, which basically intercepts all OpenGL commands issued by a program and writes them to a highly configurable output logfile. Well, it took ages (something like 1 h 30 min) to process just ONE frame rendered by CM: SF, and the output was a HUGE pile of data, about 3.5 GB. I told the program to produce an XML file of the output, which turned out to be 133 MB in size; the rest was ALL the images etc. used in just that one frame. The result was nothing very useful at first glance. (And it was very painful to try to analyze anything, especially since just about anything other than scrolling the page would make Firefox crash.) Then I decided to head back to the OpenGL.org forums to browse what I could find and ended up in the OpenGL wiki. I'm going to post some interesting stuff I found first and then something possibly more important. Source: http://www.opengl.org/discussion_boards/ubb/ultimatebb.php?ubb=get_topic;f=3;t=015382 So with the current version of OpenGL it's really hard when you hit a situation like the one we have now, so the blame is not totally on bad Nvidia drivers; OpenGL itself seems bad in this way. When I got to the OpenGL wiki things got a bit interesting here: http://www.opengl.org/wiki/index.php/Hardware_specifics:_NVidia Following up that link about multi-core I found this, and then it points here: http://developer.nvidia.com/object/multi-thread-gdc-2006.html And now back to the part about using glIntercept. I noticed that one of the last commands in that ONE frame was glGetError(). So it looks like CM: SF is doing EXACTLY what you are told NOT to do: "For example, glGetError() or glGetFloatv() etc." Nvidia suggests not having these commands in release builds when targeting multi-core systems. Or have you already fixed this for 1.04? Edit: Actually, taking another look at the XML (and waiting 5 minutes for Firefox to load it after changing the folders that hold all the images so they won't load), I found that the whole capture is absolutely filled with various glGet... commands at the beginning and at the end. I counted about 50 before giving up. [ September 07, 2007, 12:11 PM: Message edited by: Hotti ]
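     Just to illustrate the point about glGet* calls: a common pattern (a sketch only, nothing from CM: SF's actual code; GL_CHECK and DrawUnitIcon are made-up names) is to keep glGetError() in development builds and compile it out of release builds, since every glGet* call can force a multi-threaded driver to synchronize with the application thread.
     code:
       #include <windows.h>
       #include <GL/gl.h>
       #include <cstdio>

       #ifdef _DEBUG
       // Debug builds: query the driver and log anything that went wrong.
       #define GL_CHECK(label)                                                  \
           do {                                                                 \
               GLenum err = glGetError();                                       \
               if (err != GL_NO_ERROR)                                          \
                   fprintf(stderr, "GL error 0x%04X after %s\n", err, label);   \
           } while (0)
       #else
       // Release builds: no glGetError at all, so the driver thread never stalls.
       #define GL_CHECK(label) ((void)0)
       #endif

       void DrawUnitIcon(void)
       {
           glBegin(GL_QUADS);
           // ... vertices for the icon quad ...
           glEnd();
           GL_CHECK("DrawUnitIcon");   // only hits the driver in debug builds
       }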
  7. I just made a pile of different performance graphs with CMSF and the instrumented driver. I ran the game with "Better" settings for both model and texture quality, so the GPU should have a whole lot of stuff to do. The screenshots can again be found in the same place: http://www.saunalahti.fi/~hotti/cmsf/ Just for reference, I took a screenshot from the PerfSDK manual to show you where in the GPU pipeline these counters are located. The file is gpu_pipeline.jpg, and as you can see, the further down the pipeline you go, the less activity is being reported. The screens with *_GPU_* in the name are raw data straight from the GPU showing how much activity is going on; the one with *_OGL_* is info from the OpenGL driver. What I find sort of odd is that even though nothing much seems to be happening in the hardware, the GPU idle % is close to 0%. Edit: I made one more graph, or actually a collection of every counter that is measured in %, to see the big picture of what is going on in the GPU. The file for that is GPU_performance_full_details.jpg. NOTE: the MAX value in these graphs is 10%! So it looks like the GPU isn't even breaking a sweat doing all the stuff it's doing. [ September 07, 2007, 05:57 AM: Message edited by: Hotti ]
  8. This is not a problem with older Nvidia cards; it seems to be affecting only the new 8xxx series cards. Check this thread topic
  9. My GF8800GTS has 640 MB of RAM, so I REALLY don't think that's the problem. But yes, the instrumented driver would also let me make a graph of GPU memory usage; I was planning on running some extensive tests over the weekend. The more I poke around and do research (even though I'm really no expert in programming, I do know the very basics of some things), the more I get the feeling that this is something that has been overlooked in some new feature of the GF 8xxx series cards. But I also get more and more puzzled, since the drivers are built to support the newest hardware first, and the older cards are the ones that should be running into problems. Another thing that puzzles me is that I can't find any other programs having similar problems. I have all the confidence in Charles' ability to program too, but looking for some very small error (it can't be a big one, because the game is more or less working) can be next to impossible deep in the core of the engine, especially if you have no way to test whether it will eventually work.
  10. Bah... stupid me. Of course I can set the game up myself like that. So I did another little test, which gave me a small surprise. I ran the game again with all the same settings as in the previous post, except this time with these in-game settings: Model Quality: Best / Texture Quality: Fastest, and then Model Quality: Fastest / Texture Quality: Best. I was not expecting high frame rates when running the game with the best model quality, but what surprised me was how low the FPS was when I had the lowest model quality but the best texture quality. I added these results under the same link as in the previous post.
  11. I did a series of tests, having the ability to attach performance graphs from the instrumented drivers to the Microsoft Management Console. This is how I was set up (I changed these values specifically for this test) in the Nvidia control panel:
      - Anisotropic filtering: App. controlled
      - Antialiasing Gamma: On
      - Antialiasing Mode: App. controlled
      - Antialiasing Transparency: Off
      - Conformant texture clamp: Off
      - Error reporting: Off
      - Extension limit: Off
      - Force Mipmaps: None
      - Multi-display...: Single display...
      - Texture filtering - Anisotropic sample...: Off
      - Texture filtering - Negative LOD bias: Allow
      - Texture filtering - Quality: High Performance
      - Texture filtering - Trilinear opt: Off
      - Threaded optimization: Off
      - Triple buffering: Off
      - V-sync: Force off
      I ran the game with:
      - Model / texture detail level: as stated in the jpg file name
      - Resolution: 1280x1024
      - Vsync: off
      - AA / MS: off
      - High priority process: On
      - Battle: Allahs Fist
      You can find the results here: http://www.saunalahti.fi/~hotti/cmsf/ The graphs are explained beneath them, but the thick green line is the one measuring the FPS. I would say this is a more reliable way of measuring FPS, since you don't have Fraps slowing down the computer. First of all, these graphs clearly show that every step down in graphic detail makes things a bit better; I think the biggest differences can be seen between Fastest-Faster and Balanced-Improved. As you can see, with the Fastest graphic details there are more and more spikes of over 100 FPS. What is sort of odd to me is the GPU idle %, which seems to be at 26% on Best settings, drops to 16% at Better, and then rises steadily from there. I also noticed that if I scroll high up into the sky so the land disappears beneath me, it doesn't affect the FPS at all. But if I turn the camera so that I can see only the sun, I get a huge boost in FPS; likewise, looking down towards the land so I can't see the landscape in the distance gives a huge boost in FPS (those spikes in many of the graphics settings I tested). Would it be possible to use the same low-detail "horizon image" that the Fastest setting uses in all the other graphics detail settings as well? If that improves the game, then there is something wrong with how the textures are handled.
  12. Desertfox: Unfortunately, no, I cannot. I do have an older computer with a GF6800GT, but unfortunately it's an AGP card and my current mobo doesn't have an AGP slot. Also, that machine is running a Linux system which I cannot shut down except for very short periods. One thing I'm seeing here is that my mobo also uses the Intel P965 chipset.
  13. Darn. So we're back to square one with this. Well, I will keep poking around with various stuff to see if I can find more.
  14. In case you didn't read the beginning of this thread: I am running a diagnostics program for OpenGL applications with a set of diagnostic drivers made by Nvidia, and whether I use your settings or not, I get the same errors; the game is falling back to software rendering. What part of this don't you understand? I will try to write this so you too can understand what I'm saying: 1. I get better FPS, as you said; yes, that is true, the game runs smoothly with my setup too, and I never questioned this (and all I did, btw, was tune the settings down to "Balanced", leaving AA etc. for CM: SF to handle). 2. It still doesn't fix the problem, which I can clearly see every time I run CM: SF with these diagnostic drivers from Nvidia: the game falls back to software rendering. Hey, if you think you solved it, download the PerfKit and see for yourself! Install the set, configure GLExpert to show all errors (tick all the boxes) and tell it to show them in the console. Fire up the game, and as soon as you click one unit it will pop up the error console with a notification of falling back to software rendering. 3. What you do is lower the gfx detail, which frees up your CPU's resources to produce better frames per second. 4. Since your CPU is doing the graphics calculations too, obviously lowering gfx detail gives better FPS; that is the reason your solution "seems" to work, as does my version of turning the graphics down to Fastest. It gets even better results because the processor gets even less load. If you don't understand what I am doing, then I can't help you any further. The problem STILL persists and I am already wasting my time with you.
  15. Sorry to say this guys, but looking "under the hood" (with Nvidia's GLExpert) this is actually no workaround. All it does is lower the load on your CPU so it can produce a higher FPS, because currently the game is running in software mode (in other words it isn't using the 3D acceleration of your hardware), and naturally, since your processor has to do all the work that your gfx card(s) should be doing, lowering the graphics has a huge impact on FPS. So don't get your hopes up.
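     For what it's worth, here is a minimal sketch (assuming a GL context is already current; ReportRenderer is just a made-up helper name) of how to check which renderer a context reports. Note that this only catches the plain Windows "GDI Generic" software renderer; the driver-internal software fallbacks that GLExpert warns about still report the GeForce here, which is exactly why the instrumented driver is needed.
     code:
       #include <windows.h>
       #include <GL/gl.h>
       #include <cstdio>
       #include <cstring>

       void ReportRenderer(void)
       {
           // Requires a current OpenGL context, e.g. right after window setup.
           const char *vendor   = (const char *)glGetString(GL_VENDOR);
           const char *renderer = (const char *)glGetString(GL_RENDERER);
           const char *version  = (const char *)glGetString(GL_VERSION);

           printf("GL_VENDOR:   %s\n", vendor);
           printf("GL_RENDERER: %s\n", renderer);
           printf("GL_VERSION:  %s\n", version);

           if (renderer && strstr(renderer, "GDI Generic"))
               printf("-> plain Windows software rendering, no acceleration at all\n");
       }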
  16. As stated in the topic, I would like to know which extensions CM: SF uses. Would it be possible to make some sort of build (of the demo perhaps?), or a very small build with one big map, a few units, some terrain etc.? I ask because I noticed I started getting a few errors (non-fatal though) when running the tests of the OpenGL Extension Viewer from realtech-vr. These errors occurred after the tests moved up to the 1.5 (vertex buffer objects) test, but they were not the same ones I get with CM: SF. I was just wondering whether you could build some sort of test build of the CM: SF gfx engine where you could control which extensions are in use.
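     As an aside, anyone with a GL context up can dump the extension list the driver advertises and compare it against whatever the engine asks for. A small sketch (DumpExtensions is a made-up helper name, and this assumes a context is already current):
     code:
       #include <windows.h>
       #include <GL/gl.h>
       #include <cstdio>
       #include <cstring>

       void DumpExtensions(FILE *out)
       {
           // GL_EXTENSIONS is a single space-separated string; print one per line.
           const char *ext = (const char *)glGetString(GL_EXTENSIONS);
           if (!ext) return;

           while (*ext) {
               const char *space = strchr(ext, ' ');
               size_t len = space ? (size_t)(space - ext) : strlen(ext);
               fprintf(out, "%.*s\n", (int)len, ext);
               ext += len;
               while (*ext == ' ') ++ext;   // skip separators
           }
       }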
  17. But your "workaround" actually works around nothing related to the problem we are having. Sure, it increases your FPS (and as I said, it increases mine too), but you misunderstand me if you think I'm looking for better FPS. NO. I'm looking to fix this problem, which your workaround does not do. The FPS only goes up because the load on the CPU (not the GPU, as it SHOULD be) is lower thanks to the lower-level graphics. Actually I have a better "workaround" than yours: turn all graphics settings to "Fastest" and it will show a HUGE increase in FPS (but still the same errors occur as with the settings set to "Best")!!! And again, this workaround solves as much as yours, which is nothing. If you like paying 500 euros for a graphics card which doesn't seem to do SQUAT at the moment (since that IS what software rendering means: your 3D hardware is NOT in use!!), fine by me. What I want is for the game to actually use the 3D hardware my 8800GTS has. I will say this again: I am NOT trying to increase FPS at all costs or at any cost, I'm trying to find out why the game cannot use my 3D hardware. And this "workaround" doesn't make the 3D hardware active when running the game; the same errors appear as with the Best gfx settings. You and I are talking about two totally different things here. So please, let's stay on the topic of this thread, and no more workarounds here please. If you have some info about the OpenGL drivers and why they are acting like they are, post THAT info here. [ September 05, 2007, 02:46 PM: Message edited by: Hotti ]
  18. Okay, I just tried this "workaround" of yours, and it actually works around nothing. The same errors pop up in all the same places as they do with better gfx settings in my earlier experiments. I did notice an increase in FPS, but that's probably just because the gfx detail was lowered enough for my processor (Intel C2D E6400) to handle everything better (tested with Allahs Fist again). The renderer still keeps falling back to software rendering. If my processor alone can manage to make the game run like this, I should be getting something like 400 FPS with my GF8800GTS hardware, which alone has as much power as an 8-year-old computer. I too am starting to believe that this is not a driver issue, but that there is something very, very wrong in the very core of the CM2x graphics engine. Maybe you coded the game itself in software rendering mode without even knowing it? [ September 05, 2007, 01:58 PM: Message edited by: Hotti ]
  19. But the thing is, I'm not looking for a workaround. I'm looking for whatever might be causing the problems and trying to find something that could FIX it. Imo a workaround is not a fix, and I don't think a workaround will help BFC at all. I'm hoping the code that fixes the multi-core problems will help a bit, but I really want to find out WHY this thing is happening in the first place. What also strikes me as odd is that all the other problems with OpenGL and other games can be fixed one way or another, from what I've been reading on the Nvidia forums, but none of those fixes help with CM: SF.
  20. You can always return the license to the eLicense server and you get another activation. Unlike with Bioshock, the eLicense system actually works. I have other titles that use eLicense: you just go to Control Panel -> eLicense panel and return the token to the server. That is, if you have access to the old computer. But no, the PerfKit shouldn't screw things up (at least it didn't on my computer).
  21. Knaust1: the problem with the 8xxx series has existed since (release) version 1.01 of CM: SF. Dirtweasle: Just to give you a heads up, the PerfKit package needs its own driver (which is provided with the package). It gives some sort of special access to the GPU so the kit can monitor its activity. The driver is called an "instrumented driver".
  22. Knaust1: you DO know there are other CM games than Shock Force? There are Beyond Overlord, Barbarossa to Berlin and Afrika Korps. Shock Force is the first game using the CM2X engine; the older games use the CM1X engine. But this is not the thread for that. I think it would be helpful if someone else could also try the PerfKit set from Nvidia and see whether you get the same problem. The OpenGL Extension Viewer data could also be a helpful reference.
  23. Yes, the older games run just fine. I just tried a 40-turn quick battle and it went from start to finish without any problems.
  24. I have an update on the error I managed to dig out. I noticed that the error doesn't occur until I click and select a unit, and every time I click a unit it creates two errors (the icon flashing blue and green?!?). Also, when a unit (the Abrams in the Allahs Fist battle) moves, it starts to generate more and more of these errors. Edit: I did some more poking around again... This time I downloaded the OpenGL Extension Viewer: http://www.realtech-vr.com/glview/download.html This program let me get a really big pile of information about my OpenGL driver and what it can do. There was a report page which ended with these lines:
      code:
      Extension verification:
      GL_EXT_copy_texture was not found, but has the entry point glCopyTexSubImage3DEXT
      GL_EXT_subtexture was not found, but has the entry point glTexSubImage3DEXT
      GL_NV_half_float has the entry point glVertexWeighthNV missing!
      GL_NV_half_float has the entry point glVertexWeighthvNV missing!
      WGL_NV_vertex_array_range was not found, but has the entry point wglAllocateMemoryNV
      WGL_NV_vertex_array_range was not found, but has the entry point wglFreeMemoryNV
      GL_HP_occlusion_test was not found, but is available in driver version 2.1.0
      GL_NV_framebuffer_multisample_ex was not found, but is available in driver version 2.1.0
      GL_NV_texture_compression_latc was not found, but is available in driver version 2.1.0
      GL_OES_conditional_query was not found, but is available in driver version 2.1.0
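     Those verification lines mean the extension names in GL_EXTENSIONS and the entry points returned by wglGetProcAddress don't agree. A rough sketch of that kind of check (CheckExtension is a made-up helper name; the substring test is only good enough for a quick look, and a current GL context is required):
     code:
       #include <windows.h>
       #include <GL/gl.h>
       #include <cstdio>
       #include <cstring>

       void CheckExtension(const char *extName, const char *entryPointName)
       {
           // Is the extension advertised in the extension string?
           const char *all = (const char *)glGetString(GL_EXTENSIONS);
           int advertised = (all && strstr(all, extName)) ? 1 : 0;

           // Does its entry point actually resolve? (needs a current context)
           PROC entry = wglGetProcAddress(entryPointName);

           printf("%s: %s in GL_EXTENSIONS, entry point %s %s\n",
                  extName,
                  advertised ? "listed" : "NOT listed",
                  entryPointName,
                  entry ? "resolves" : "is missing");
       }

       // e.g. CheckExtension("GL_NV_half_float", "glVertexWeighthNV");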
  25. Well... as I also have a dual-boot system with Windows XP and Vista Business Edition, I could try reverting my current Vista drivers (the 163.44 beta for Bioshock) to 158.18 and see if I notice any improvement. But only later today.