
CM performance thread


Yeah, I'm silly. Thank you for pointing it out, Steiner - I somehow missed the play button. Uuumpgh. My test is void; I'll have to run a new one to be comparable with the other tests here.


Damn, can't play my game - get a license error out of the blue. Already submitted a ticket, hope they solve this fast so I can finish this testing properly...


FRAPS RESULTS:

minimum - maximum - average framerate:

Shadows/Shaders: ON: 10 / 20 / 13.86

Shadows OFF/Shaders ON: 18 / 34 / 24.23

Shadows ON/Shaders OFF: 11 / 21 / 15.65

Shadows/Shaders: OFF: 18 / 34 / 24.608

HARDWARE:

OS: Win8 x64

CPU: i5-3570k @ 3.4 GHz

Mainboard: Asrock z77 pro 3

RAM: 8GB DDR3

Graphics card (overclocked?): GeForce GTX 275 (no) driver 9.17.10.2932

Graphics memory: 896 MB

Hard Disk: G.Skill Phoenix Pro SSD 128 GB

SETTINGS

Screen resolution of the game: 1920x1080

In game model quality: Best

In game texture quality: Best

Trees (full/distant): Full

GRAPHICS CARD SETTINGS

Anti-aliasing: 2x (also 2x in game)

Texture filtering: 2x (high quality)

Notes: Full trees, for me, give better performance than distant.

Looks like VRAM makes a lot of difference.


Yes. I just wanted to keep the benchmark as consistent as possible. I think all testers should use the same settings to get the most accurate results

Yes. I just wanted to keep the benchmark as consistent as possible. I think all testers should use the same settings to get the most accurate results
Well, you might have a point, but bear in mind that then not only in-game settings would have to be equal but also the settings in the Nvidia/ATI Radeon control panels.

My game is up and running again, solved by a gsClean utility sent to me by a Battlefront technician. We aren't sure what triggered the error.

Anyway, I'm getting the impression my game runs faster now that the gsClean utility has been run.

So I reran the test - this time I locked the chosen unit and pressed the play button as I should have done in the first place. :o Z folder removed from the data folder.

FRAPS RESULTS

Avg: 35.500 - Min: 22 - Max: 44

SETTINGS

Screen resolution of the game: 1280x1024

Vertical Synchronisation on/off: off

In game model quality: balanced

In game texture quality: improved

Shadows on/off: on

Shaders on/off: on

Trees (full/distant): full

GRAPHICS CARD SETTINGS

Anti-aliasing: 16xCSAA

Texture filtering: 16x

HARDWARE

Manufacturer + model:

OS: Windows 7 64bit

CPU: AMD FX-6300 BOX 3.5 GHz - 14MB Cache - 95W

Mainboard: Asus AM3+ M5A97 (970 ATX)

RAM: DDR3 1600 8GB CL8 Corsair 2x4GB Vengeance

Graphics card: Geforce 550 Ti (MSI)

Graphics memory: 1 GB

Hard Disc (SSD): Samsung 840 Pro 128GB

All at vanilla settings, nothing over-clocked.

Edit 1:

It's the first time I've noticed that changing 3D model quality (in-game + and - keys) changes the drawing distance of grass (the higher the quality, the further grass renders and the denser it looks) and also how far away ground and vegetation textures swap to lower/blurred resolution. Having it at "improved" makes for a visual sweet spot for me while still retaining enough FPS (going from balanced to improved I lose 1/4 of overall FPS). Having it at "better" is too big an FPS drop to be worth it (FPS drops by exactly 1/2 compared to balanced!). So I'll be using either the balanced or improved 3D model setting from now on, depending on the overall FPS situation in a given scenario. It's good that you can adjust this in-game without having to go to the main menu's settings. It would be very neat if the same could be done with the texture quality setting.

FRAPS RESULTS

Avg: 35.000 - Min: 22 - Max: 43

In game texture quality: fastest

Note: I get horrendous/unacceptable close up visuals (all the textures very blurred).

FRAPS RESULTS

Avg: 32.633 - Min: 21 - Max: 40

In game texture quality: balanced

FRAPS RESULTS

Avg: 26.167 - Min: 16 - Max: 34

In game texture quality: improved

Note: Notice a very big difference from my first test in this post, which used exactly the same settings! Why is this so? I haven't changed anything at all, just reran the test!

FRAPS RESULTS

Avg: 34.967 - Min: 25 - Max: 44!!!

In game texture quality: better!!!

Note: This is where common sense and expectations go out the window. FPS should be lower than with the improved setting, but it's instead higher!

FRAPS RESULTS

Avg: 35.667 - Min: 25 - Max: 44 !

In game texture quality: best!

Note: I get the highest FPS when setting the in-game texture quality to best, with accompanying nice visuals. Another funky situation going against common logic! Same as before with the antialiasing setting. Might be worth notifying all those with Nvidia cards to see if that holds true for them too.

Edit 2:

FRAPS RESULTS

Avg: 33.017 - Min: 20 - Max: 41

In game model quality: balanced

In game texture quality: best

Note: Again an inconsistency with prior testing. I'll have the in-game settings set up that way from now on. Seems like the best FPS/visual compromise.

FRAPS RESULTS

Avg: 28.467 - Min: 18 - Max: 36

In game model quality: improved

In game texture quality: best

Note: FPS scales down as expected when 3D model quality is raised. When the camera moves to the left with the AFV it gives me a non-smooth transition, so I opted not to use improved 3D model quality. Having it at balanced gives me a smooth experience.
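A pair of such runs gives the relative cost of a setting change directly from the reported averages. A trivial Python sketch using the Edit 2 numbers above (variable names are mine):

```python
# Relative FPS cost of raising 3D model quality from balanced to
# improved, using the two averages reported in Edit 2.
balanced, improved = 33.017, 28.467

drop = (balanced - improved) / balanced
print(f"improved costs {drop:.0%} of the balanced frame rate")
```

That works out to roughly a 14% drop here - noticeably less than the 1/4 loss estimated in Edit 1 between the same two settings, which fits the run-to-run noise noted throughout this thread.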

Edit 3:

Avg: 35.767 - Min: 22 - Max: 44

In game model quality: balanced

In game texture quality: best

Maximum pre-rendered frames (Nvidia control panel): 1 (had set it to 4 prior to that)

Note: Looks like having it at 1 improves FPS, but I can't really be sure now due to the noted inconsistencies (running multiple tests with the same settings gives me different FPS figures).

Well, you might have a point, but bear in mind that then not only in-game settings would have to be equal but also the settings in the Nvidia/ATI Radeon control panels.

If we are looking for any consistency, we all need to use the same video card control panel settings. We are already removing mods to eliminate that variable from the equation. I really think that we should coordinate as many settings - in game and out - as possible. I'm not sure how much we will learn if everyone is using different control panel or ingame settings.

Just my two cents. :)


Could not edit my previous post any more.

I ran 3 tests (without reloading the scenario from save games menu) with exactly the same settings which show that there is not much FPS difference between the 3.

2013-03-12 13:49:21 - CM Normandy

Frames: 2146 - Time: 60000ms - Avg: 35.767 - Min: 22 - Max: 44

2013-03-12 13:59:17 - CM Normandy

Frames: 2068 - Time: 60000ms - Avg: 34.467 - Min: 22 - Max: 44

2013-03-12 14:02:36 - CM Normandy

Frames: 2189 - Time: 60000ms - Avg: 36.483 - Min: 23 - Max: 45

I was wondering if the same is true if I reload the scenario.

2013-03-12 14:18:03 - CM Normandy

Frames: 2184 - Time: 60000ms - Avg: 36.400 - Min: 26 - Max: 45

So as you can see, the difference this time was 4 frames for min FPS, while the rest were basically the same.

Looks like there is FPS consistency in this regard.

I'm even more baffled as to why I got such a different result before using exactly the same settings:

Avg: 35.500 - Min: 22 - Max: 44

and

Avg: 26.167 - Min: 16 - Max: 34

Edit: Did a test with my Z folder (where more than 4 GB of my mods reside) put back in place and got the following result:

2013-03-12 14:31:52 - CM Normandy

Frames: 2143 - Time: 60000ms - Avg: 35.717 - Min: 23 - Max: 44

Mods in my case don't make any FPS difference, though scenario loading time is a bit longer.
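For anyone tallying these numbers, the FRAPS log lines quoted in this thread are easy to crunch programmatically. A minimal Python sketch - the function names are mine, not anything FRAPS provides - that also shows the reported Avg is just total frames over elapsed seconds:

```python
import re

# FRAPS benchmark lines as pasted in this thread, e.g.
# "Frames: 2146 - Time: 60000ms - Avg: 35.767 - Min: 22 - Max: 44"
_LINE = re.compile(
    r"Frames:\s*(\d+)\s*-\s*Time:\s*(\d+)ms"
    r"\s*-\s*Avg:\s*([\d.]+)\s*-\s*Min:\s*(\d+)\s*-\s*Max:\s*(\d+)"
)

def parse_fraps(line):
    """Return the numbers from one FRAPS log line as a dict."""
    m = _LINE.search(line)
    if m is None:
        raise ValueError(f"not a FRAPS result line: {line!r}")
    frames, time_ms, avg, lo, hi = m.groups()
    return {"frames": int(frames), "time_ms": int(time_ms),
            "avg": float(avg), "min": int(lo), "max": int(hi)}

def avg_fps(frames, time_ms):
    # The logged average is simply total frames over elapsed seconds.
    return frames / (time_ms / 1000)

r = parse_fraps("Frames: 2146 - Time: 60000ms - Avg: 35.767 - Min: 22 - Max: 44")
print(round(avg_fps(r["frames"], r["time_ms"]), 3))  # matches the logged Avg
```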

Guest

I discovered last night - while looking into a bug - that folks with Asus Geforce cards have had serious performance trouble with the Asus "crap that comes with the card" software installed. So just bare drivers and Nvidia tools would be best, no extra Asus overclocking utilities or anything, as they may cause nasty issues. Just FYI.

A note that's more on topic: performance of control panel settings will change, irrespective of Combat Mission, with new driver releases. So when using different control panel settings you're really measuring card and driver performance more than Combat Mission performance (case in point, Hister's 16xCSAA performance, which is entirely down to his drivers and has nothing to do with anything CM is doing).

Panel settings have nothing to do with Combat Mission - you're just forcing your card to use those settings while rendering Combat Mission. Very different things, as obviously we can't change any of that, AND it's not necessarily long-term relevant to your (or others', if they see your results and buy your card) card's performance with CM. Again FYI.

I discovered last night - while looking into a bug - that folks with Asus Geforce cards have had serious performance trouble with the Asus "crap that comes with the card" software installed. So just bare drivers and Nvidia tools would be best, no extra Asus overclocking utilities or anything, as they may cause nasty issues. Just FYI.

A note that's more on topic: performance of control panel settings will change, irrespective of Combat Mission, with new driver releases. So when using different control panel settings you're really measuring card and driver performance more than Combat Mission performance (case in point, Hister's 16xCSAA performance, which is entirely down to his drivers and has nothing to do with anything CM is doing).

Panel settings have nothing to do with Combat Mission - you're just forcing your card to use those settings while rendering Combat Mission. Very different things, as obviously we can't change any of that, AND it's not necessarily long-term relevant to your (or others', if they see your results and buy your card) card's performance with CM. Again FYI.

That makes sense Phil, thanks.


I do understand that Phil.

I have an issue: if I set the in-game antialias/multisample option to on and make no change in my control panel, I get no anti-aliasing at all - the visuals are exactly the same as if I set this option to off. In order to get antialiasing in the game I need to set the antialiasing mode to override any application setting and then set the antialiasing level accordingly.

The game actually makes me do it that way if I want to have any antialiasing in it!

Guest

Well... not as such. It sounds like your driver/card combo won't allow us to activate AA if it's turned off in the control panel (presumably if you *didn't* have a separate settings profile it would?). So it's to do with how your driver interacts with the control panel, and CM doesn't have any control over that - the game isn't making you do anything, your driver is. :)

I'm definitely not saying that you guys should, while playing, turn all those cool options off, just that CM doesn't have any control over them (and in your example, Hister, it sounds like *they're* affecting CM without any recourse on our end), and you guys may want to take that into account for performance results.


Interesting.

My default antialiasing setting in the Nvidia control panel is set to application controlled. So when I have it set like that and turn the in-game antialiasing/multisample option on, I get no antialiasing - everything is as jagged as one would expect with no AA applied.

So you are saying that normally such a setup (AA application controlled in the Nvidia control panel and the in-game antialiasing/multisample option set to on) should produce antialiasing in the game?

Maybe it's connected to having a separate profile for this game - I will test this out and report back here, meaning I will get rid of the special Nvidia control panel profile for CMBN and set it up as mentioned in the paragraph above.

How much antialiasing does this game apply by default when you set the in-game antialiasing/multisample option to on? 2x, 4x, 8x?

As this is the second time you're pointing out that my observation has nothing to do with your game, I'll make a statement here:

I'm not implying CMBN has something funky going on underneath its graphical belly. I acknowledge that what I'm experiencing is most likely related to my hardware/driver combo, which isn't optimised well for this game, and not vice versa.

Just to put you more at ease, he he.

I'm definitely not saying that you guys should, while playing, turn all those cool options off, just that CM doesn't have any control over them (and in your example, Hister, it sounds like *they're* affecting CM without any recourse on our end), and you guys may want to take that into account for performance results.
Well, I thought that was self-evident. To get comparable results, all the settings in the Nvidia control panel would have to be the same amongst the testers, and all in-game settings too. To simplify, all Nvidia GPU owners testing here could go to "Adjust image settings with preview" in the control panel and select "Use my preference emphasizing: Quality". That would be the simplest solution, and all testers would then have the same software specs. That's how to get really proper test results with which we could see how well a certain hardware combo performs, if that is what we are after.

Guest

Oh, yeah, no need to put me at ease. :) This is a well-behaved thread and has been a joy to contribute to so far.

I'm repeating myself not because I don't think you get it (I know you do!) but because I'm answering not just you, but everybody else who is wondering about what you're wondering about, but may not have read the whole thread or caught my earlier points. I expect this to be a heavily read thread. So... yeah, I may repeat myself a bit. :) (Edit: And also say some things that probably seem self-evident. ;) )

I would be much obliged if you'd see if not having a control panel entry still didn't allow AA - that'd be a bug then. The normal AA should be 4x, depending on your hardware. So it *should* be noticeable, although probably not as pretty as the higher settings.


I was wondering if the same is true if I reload the scenario.

2013-03-12 14:18:03 - CM Normandy

Frames: 2184 - Time: 60000ms - Avg: 36.400 - Min: 26 - Max: 45

So as you can see, the difference this time was 4 frames for min FPS, while the rest were basically the same.

Looks like there is FPS consistency in this regard.

The min and max values are what they are: a snapshot. If the OS performs a critical operation at the moment the game is demanding the most power, the rate can easily be affected, and if the OS is "silent" while the game is least demanding, the highest rate will not be affected. That's why the average framerate is the valuable figure: transient fluctuations don't play such a big role in that result.

It's also never wrong, if a result seems strange, to run two more tests to exclude transient effects.
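The point about min being a snapshot is easy to see with a toy example: a single one-second OS hiccup in a 60-second run drags the minimum way down while barely denting the average. A quick illustrative sketch in Python (all the sample numbers are invented):

```python
from statistics import mean

# 60 one-second FPS samples: a steady run vs. the same run with a
# single one-second OS hiccup (numbers invented for illustration).
steady = [36] * 60
hiccup = [36] * 59 + [16]

print(min(steady), round(mean(steady), 1))  # steady min and average
print(min(hiccup), round(mean(hiccup), 1))  # min collapses, average barely moves
```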

Oh, yeah, no need to put me at ease. This is a well-behaved thread and has been a joy to contribute to so far.

I'm repeating myself not because I don't think you get it (I know you do!) but because I'm answering not just you, but everybody else who is wondering about what you're wondering about, but may not have read the whole thread or caught my earlier points. I expect this to be a heavily read thread. So... yeah, I may repeat myself a bit. (Edit: And also say some things that probably seem self-evident. )

Ok, I see, carry on lads then! ;)

I would be much obliged if you'd see if not having a control panel entry still didn't allow AA - that'd be a bug then. The normal AA should be 4x, depending on your hardware. So it *should* be noticeable, although probably not as pretty as the higher settings.
OK, just to clear this up: you want me to delete the special profile for CMBN in the Nvidia control panel? Have global Nvidia control panel settings with AA set to application controlled and the in-game AA setting on?

I can do that, sure. Will report tomorrow, since my GF insists I depart from my PC now and drive her to watch the Oz movie or something.

Hister,

could you add test results with model BEST/ textures BEST/ shadows ON/ shaders ON (the most beautiful look of CM, the optimum)?

I did so. Here is the result:

2013-03-12 18:25:34 - CM Normandy

Frames: 1066 - Time: 60000ms - Avg: 17.767 - Min: 11 - Max: 26

As you can see, 3D model quality set to best cripples my FPS. When the camera pans around with the turret I notice "picture tearing", which is far from the smooth visual experience I get when the camera pans with it set to balanced.


Graphics Card Settings off


minimum - maximum - average framerate:

Shadows/Shaders: ON: 17 / 33 / 23.146

Shadows OFF/Shaders ON: 25 / 46 / 33.728

Shadows ON/Shaders OFF: 20 / 35 / 26.08

Shadows/Shaders: OFF: 27 / 48 / 36.654

HARDWARE:

OS: Win8 x64

CPU: i5-3570k @ 3.4 GHz

Mainboard: Asrock z77 pro 3

RAM: 8GB DDR3

Graphics card (overclocked?): GeForce GTX 275 (no) driver 9.17.10.2932

Graphics memory: 896 MB

Hard Disk: G.Skill Phoenix Pro SSD 128 GB

SETTINGS

Screen resolution of the game: 1920x1080

In game model quality: Best

In game texture quality: Best

Trees (full/distant): Full

In-game antialias is on



Okay just picked up my new computer, no more than two weeks old. And this is a little scary...

FRAPS RESULTS:

Min: 16 - Max: 21.6 - Avg: 20.4

HARDWARE

Manufacturer + model:

OS: Windows 7 SP 1

CPU: i7 - 3770k 3.50 GHz (not overclocked)

RAM: DDR3 16GB - 2400MHz Corsair Dominators

Graphics card (overclocked?): Gigabyte GTX690 (no)

Graphics memory: 4GB DDR5

SETTINGS

Screen resolution of the game: 1920 x 1200 (16:10)

In game model quality: Best

In game texture quality: Best

Shadows on/off: On

Shaders on/off: On

GRAPHICS CARD SETTINGS

Anti-aliasing: 16x CSAA

Texture filtering: Application Controlled

A few more tests showed no real difference from the above results. The 16x CSAA option and the stock standard in-game settings make no real difference (maybe 1 FPS). I have mods installed, but that only subtracts 1-2 FPS. My previous computer had a GTX580 and a slower CPU (i7 920 at 2.67 GHz) and got the same sort of frame rates when moving to CMBN 2.00. Disabling one of the GPUs made no difference either.

Though the game is playable, the fact that I can get 110 FPS in Shogun 2 with max settings makes this a little scary. :D Hope this helps the BF programming team. Anything else you want me to test, let me know.


Phil, I'm sorry for giving you a false alarm - having AA mode set to application controlled and in-game antialiasing/multisample turned to on works as advertised - I get anti-aliasing. Where I probably fubared was that I missed reading the "Takes effect after restarting the game" small text below the aforementioned option. :eek:

It doesn't matter if I have a separate or a global profile for CMBN - having AA mode at application controlled and in-game setting to on gives me antialiasing. Glad that's solved and no bug is involved.

The min and max values are what they are: a snapshot. If the OS performs a critical operation at the moment the game is demanding the most power, the rate can easily be affected, and if the OS is "silent" while the game is least demanding, the highest rate will not be affected. That's why the average framerate is the valuable figure: transient fluctuations don't play such a big role in that result.

It's also never wrong, if a result seems strange, to run two more tests to exclude transient effects.

That's so true, Steiner! Just now I did 3 tests with identical settings in the Nvidia control panel and in-game. These are the results:

2013-03-13 12:05:16 - CM Normandy

Frames: 2405 - Time: 60000ms - Avg: 40.083 - Min: 27 - Max: 49

2013-03-13 12:11:25 - CM Normandy

Frames: 1879 - Time: 60000ms - Avg: 31.317 - Min: 21 - Max: 43

2013-03-13 12:13:20 - CM Normandy

Frames: 2394 - Time: 60000ms - Avg: 39.900 - Min: 28 - Max: 50

Note the second test's average! I think it's necessary to run at least 2, if not 3, tests with the same settings to get a more solid result.
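Pooling the averages of repeated runs takes two lines with Python's statistics module. A minimal sketch using the three results just quoted: the mean is the figure worth reporting, and the spread shows how noisy the test is.

```python
from statistics import mean, pstdev

# Average FPS from three back-to-back runs at identical settings,
# as reported above (40.083 / 31.317 / 39.900).
runs = [40.083, 31.317, 39.900]

print(f"mean of runs: {mean(runs):.1f} FPS")     # the figure worth quoting
print(f"run-to-run spread: {pstdev(runs):.1f}")  # how noisy the test is
```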


FRAPS RESULTS:

(Final number is the math average. For actual experiential average, add 1-2 FPS to rightmost number. See detailed notes at bottom.)

14/20/17

HARDWARE

Manufacturer + model: DIY build

OS: W7-64 Ult

CPU: I7-950 at 3.07 GHz (NOT OC)

Mainboard: Asus Sabertooth X58

RAM: 6 GB

Graphics card (NOT OC): EVGA (Nvidia) GTX 550 Ti 2 GB

Graphics memory: 2 GB

Graphics drivers: 314.07

SETTINGS

Screen resolution of the game: 1920 x1080

Vert sync: ON

AA/MS: ON

Hi-Priority: ON

In game model quality: BEST

In game texture quality: BEST

Shadows on/off: ON

Shaders on/off: ON

Trees (full/distant): ON FULL

Mods: Z removed

GRAPHICS CARD SETTINGS: DEFAULT

Additional:

(Final number is the math average. For actual experiential average, add 1-2 FPS to rightmost number. See detailed notes at bottom.)

Shadows Off: 21/27/24

Shaders Off: 14/19/16.5

Shadows and Shaders Off: 23/29/26

More detailed (FPS as experienced during play. Dashes represent from-to while executing turns/moving):

Shadows and Shaders On:

1st advance: 18-19

L Turn: 14-17

2nd advance: 17

R Turn: 15-18

Final advance to fence: 15

Shadows Off:

1st advance: 27

L Turn: 21-24

2nd advance: 27-30

R Turn: 24-27

Final advance to fence: 22

Shaders Off:

1st advance: 18-19

L Turn: 14-15

2nd advance: 17-18

R Turn: 16-18

Final advance to fence: 15

Shadows and Shaders Off:

1st advance: 28-29

L Turn: 23-25

2nd advance: 28-29

R Turn: 27-28

Final advance to fence: 23

NOTE 1: No difference noticed between fresh system boot and non. 3 tests run with similar results.

NOTE 2: I didn't have time to test FPS values on lesser model quality, but on a different scenario with tall brick and stone walls, I notice occasional vertical lines along tile divisions on better model quality and many, many such lines on improved model quality. There are no such lines using best model quality, as I normally do.

Okay just picked up my new computer, no more than two weeks old. And this is a little scary...

Wow. Your system is quite a bit more powerful than mine, and that's only giving you 1-2 FPS over mine.

Looks like there's no need to think about upgrading until at least Bagration or Bulge.


Doing another batch of testing - since I've settled on my in-game settings, it's time to fiddle with the Nvidia control panel settings. I'm changing only one control panel option at a time for a better overview.

CMBN in-game settings

Display Size: Desktop (1280x1024)

Vertical Synchronisation: Off

3D Model Quality: Balanced

3D Texture Quality: Best

Antialias/Multisample: On

High Priority Process: On

Nvidia Control Panel Settings

Anisotropic filtering: Application controlled

Antialiasing - FXAA: Off

Antialiasing - Gamma Correction: On

Antialiasing - Mode: Enhance application setting

Antialiasing - Setting: 16xCSAA

Antialiasing - Transparency: Multisample

Maximum Pre-rendered Frames: Application controlled

Power Management: Maximum performance

Texture Filtering - Anisotropic sample optimization: Off

Texture Filtering - Negative LOD Bias: Allow

Texture Filtering - Quality: High quality

Texture Filtering - Trilinear Optimization: On

Threaded Optimization: Auto

Triple Buffering: Off

Texture Filtering Anisotropic Filter Optimization: Off

2013-03-13 12:36:13 - CM Normandy

Frames: 2146 - Time: 60000ms - Avg: 35.767 - Min: 23 - Max: 44

2013-03-13 12:39:15 - CM Normandy

Frames: 2332 - Time: 60000ms - Avg: 38.867 - Min: 23 - Max: 54

2013-03-13 12:53:26 - CM Normandy

Frames: 2395 - Time: 60000ms - Avg: 39.917 - Min: 27 - Max: 50

--------------------------------------------------------------------------

Nvidia Control Panel Settings

Anisotropic filtering: 16x

Antialiasing - FXAA: Off

Antialiasing - Gamma Correction: On

Antialiasing - Mode: Enhance application setting

Antialiasing - Setting: 16xCSAA

Antialiasing - Transparency: Multisample

Maximum Pre-rendered Frames: Application controlled

Power Management: Maximum performance

Texture Filtering - Anisotropic sample optimization: Off

Texture Filtering - Negative LOD Bias: Allow

Texture Filtering - Quality: High quality

Texture Filtering - Trilinear Optimization: On

Threaded Optimization: Auto

Triple Buffering: Off

Texture Filtering Anisotropic Filter Optimization: Off

2013-03-13 13:16:00 - CM Normandy

Frames: 2315 - Time: 60000ms - Avg: 38.583 - Min: 26 - Max: 49

2013-03-13 13:19:04 - CM Normandy

Frames: 2475 - Time: 60000ms - Avg: 41.250 - Min: 29 - Max: 50

2013-03-13 13:20:14 - CM Normandy

Frames: 2459 - Time: 60000ms - Avg: 40.983 - Min: 29 - Max: 50

Note: A few more FPS without any apparent visual improvement.

--------------------------------------------------------------------------

Nvidia Control Panel Settings

Anisotropic filtering: Application Controlled

Antialiasing - FXAA: On

Antialiasing - Gamma Correction: On

Antialiasing - Mode: Enhance application setting

Antialiasing - Setting: 16xCSAA

Antialiasing - Transparency: Multisample

Maximum Pre-rendered Frames: Application controlled

Power Management: Maximum performance

Texture Filtering - Anisotropic sample optimization: Off

Texture Filtering - Negative LOD Bias: Allow

Texture Filtering - Quality: High quality

Texture Filtering - Trilinear Optimization: On

Threaded Optimization: Auto

Triple Buffering: Off

Texture Filtering Anisotropic Filter Optimization: Off

2013-03-13 13:29:02 - CM Normandy

Frames: 2137 - Time: 60000ms - Avg: 35.617 - Min: 23 - Max: 44

Note: Bad visual quality - lots of jaggies, and more the further away the objects in the game are. Up close it's OK. Not an option at all for the graphic whores amongst us, especially since there's no FPS improvement.

--------------------------------------------------------------------------

Nvidia Control Panel Settings

Anisotropic filtering: Application controlled

Antialiasing - FXAA: Off

Antialiasing - Gamma Correction: On

Antialiasing - Mode: Override any application setting

Antialiasing - Setting: 16xCSAA

Antialiasing - Transparency: Multisample

Maximum Pre-rendered Frames: Application controlled

Power Management: Maximum performance

Texture Filtering - Anisotropic sample optimization: Off

Texture Filtering - Negative LOD Bias: Allow

Texture Filtering - Quality: High quality

Texture Filtering - Trilinear Optimization: On

Threaded Optimization: Auto

Triple Buffering: Off

Texture Filtering Anisotropic Filter Optimization: Off

2013-03-13 13:37:31 - CM Normandy

Frames: 2282 - Time: 60000ms - Avg: 38.033 - Min: 26 - Max: 48

2013-03-13 13:39:00 - CM Normandy

Frames: 2229 - Time: 60000ms - Avg: 37.150 - Min: 25 - Max: 46

2013-03-13 13:40:07 - CM Normandy

Frames: 2218 - Time: 60000ms - Avg: 36.967 - Min: 26 - Max: 47

Note: Not much difference from "Enhance application setting" (1st test).

--------------------------------------------------------------------------

Nvidia Control Panel Settings

Anisotropic filtering: Application controlled

Antialiasing - FXAA: Off

Antialiasing - Gamma Correction: On

Antialiasing - Mode: Application controlled

Antialiasing - Setting: Application controlled

Antialiasing - Transparency: Multisample

Maximum Pre-rendered Frames: Application controlled

Power Management: Maximum performance

Texture Filtering - Anisotropic sample optimization: Off

Texture Filtering - Negative LOD Bias: Allow

Texture Filtering - Quality: High quality

Texture Filtering - Trilinear Optimization: On

Threaded Optimization: Auto

Triple Buffering: Off

Texture Filtering Anisotropic Filter Optimization: Off

2013-03-13 14:30:33 - CM Normandy

Frames: 2119 - Time: 60000ms - Avg: 35.317 - Min: 22 - Max: 45

2013-03-13 14:32:31 - CM Normandy

Frames: 1806 - Time: 60000ms - Avg: 30.100 - Min: 20 - Max: 42

2013-03-13 14:33:44 - CM Normandy

Frames: 1798 - Time: 60000ms - Avg: 29.967 - Min: 20 - Max: 42

Note: As you can see again, application controlled means actual 4x AA in-game, and 16x AA gives me a better FPS rate.

--------------------------------------------------------------------------

Testing continues...


Very interesting!

The rigs some of you have should blow my results to pieces, but for all your extra computing power, memory bandwidth and GFX power, you get almost no advantage.

This very bad scaling with much higher overall system power (CPU + memory bandwidth + GFX) IMO hints at a software-dependent brake.

If I had to guess, somewhere in the software a procedure waits for a certain result before advancing to the next frame/timeslot.
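That guess can be illustrated with a deliberately crude toy model (purely hypothetical numbers, no knowledge of CM's internals implied): if a fixed blocking wait sits in every frame, even a much faster GPU buys only a few FPS.

```python
def fps_serialized(sim_ms, render_ms, sync_ms):
    """Frames per second when simulation, rendering, and a blocking
    sync step run strictly one after another (nothing overlaps)."""
    return 1000 / (sim_ms + render_ms + sync_ms)

# Hypothetical per-frame costs in milliseconds.
baseline = fps_serialized(20, 10, 10)  # baseline frame rate
fast_gpu = fps_serialized(20, 5, 10)   # rendering twice as fast: small gain
print(round(baseline, 1), round(fast_gpu, 1))
```

Here halving the render time raises the frame rate by only a few FPS, because the serialized simulation and sync costs dominate: exactly the kind of poor scaling seen in this thread.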

Guest

Thanks guys, this is helpful.

Okay just picked up my new computer, no more than two weeks old. And this is a little scary...

FRAPS RESULTS:

Min: 16 - Max: 21.6 - Avg: 20.4

My first thought: drivers. People with less beefy systems are getting better frame rates than you. That sounds like an issue with your drivers, or perhaps your card's OpenGL performance (much more likely your drivers though). More on that below.

Though the game is playable, the fact I can get 110 FPS in Shogun 2 with max settings this is a little scary. :D Hope this helps the BF programming team. Anything else you want me to test let me know.

Shogun 2 is doing, on average, about 40-50x less processing in a particular battle scene than CM is. I play the TW series too, understand fairly well how it works on the back end, and I can comfortably say that with rendering, pathfinding, and AI, we're doing way more of all of them. On top of that, CM is using OpenGL, which uses entirely different parts of your drivers than Shogun.

My guess is your drivers have an OpenGL bug, which is sadly pretty common - OpenGL is much more portable but DirectX gets more support from Nvidia and ATI due to Microsoft's influence. Long story short: driver updates may improve your performance significantly without either of us having to change anything, so that's nice. :)

That said, I do want to point out that we do look for ways to optimize. Problem is that we've been doing it for years, and we're pretty good at our jobs! By this point we've either tried, implemented, or ruled out pretty much all of the applicable optimizations. CM just... does a lot.

Phil, I'm sorry for giving you a false alarm - having AA mode set to application controlled and in-game antialiasing/multisample turned to on works as advertised - I get anti-aliasing. Where I probably fubared was that I missed reading the "Takes effect after restarting the game" small text below the aforementioned option. :eek:

It doesn't matter if I have a separate or a global profile for CMBN - having AA mode at application controlled and in-game setting to on gives me antialiasing. Glad that's solved and no bug is involved.

Very cool! Thanks for checking that out. I'm glad there's no bug either - that probably would have been a monster to sort out. :)

Wow. Your system is quite a bit more powerful than mine, and that's only giving you 1-2 FPS over mine.

Looks like there's no need to think about upgrading until at least Bagration or Bulge.

Upgrading may not significantly increase your frame rates. What it MAY do is make you better able to play larger scenarios more smoothly. That's not as counter-intuitive as it sounds: averaging even 1-2 more FPS will help a lot if there's a heap of processing happening and everything takes a dip.

