
BFC, What the heck is going on?



Hey guys,

I have performed some tests on my computer which is the following:

AM2 AMD 4600+

2GB PC6400 DDR2

MSI K9 NEO F

7900 GTO (really a 7900 GTX) with driver version 94.24. These drivers may be outdated, but are regarded as among the best for the 7x series graphics cards.

At 1024x768 with 8x AA and 16x AF I get an average of 25 fps looking over an empty map in the scenario editor (300m x 300m).

When I add an entire Stryker battalion, the framerate drops another 5 fps, to 20 fps.

It might be safe to conclude that some of the terrain is wrecking my performance. It becomes even more obvious in cities, where my framerate drops down to a mere 7-8 fps. Turning AA and AF off has little impact (it saves 1 or 2 fps), and even reducing the 3D model quality to fastest provides little improvement.

And it gets better: whereas I get 25 fps looking over a flat, empty 300m x 300m map with everything set to best and 8x AA and 16x AF, the framerate actually goes down to 20 fps when I set texture quality to "fastest".

In addition, the game rarely, if ever, uses more than 35% of my processor, so it does not seem like a CPU issue. Furthermore, memory usage does not seem to exceed 1GB, thus I would assume all is well in that department.
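In case anyone wants to reproduce these readings on their own machine, here's a rough sketch of how the usage figures could be sampled with Python and the psutil library. The process name below is a guess on my part; check Task Manager for whatever the game actually shows up as:

```python
import psutil  # third-party: pip install psutil

# Hypothetical process name -- check Task Manager for the real one.
PROCESS_NAME = "CM Shock Force.exe"

# Find the running game process by name.
game = next(p for p in psutil.process_iter(["name"])
            if p.info["name"] == PROCESS_NAME)

# Sample CPU and memory usage once per second for a minute.
for _ in range(60):
    cpu = game.cpu_percent(interval=1.0)          # can exceed 100% on multi-core CPUs
    ram_mb = game.memory_info().rss / (1024 ** 2)  # resident memory in MB
    print(f"CPU: {cpu:5.1f}%   RAM: {ram_mb:7.1f} MB")
```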

Finally, I think it is safe to say that even though this graphics card is not among the top models of today, it is hardly "an old piece of hardware" and runs other very demanding games extremely smoothly at the highest settings.

Also, because I'm still on a 17" CRT (yeah I know :P ), 1024x768 is hardly a high resolution these days, and the card should keep up with that very well indeed.

I hope you guys can provide some answers, as the game is _really_ unplayable for me just as I'm starting to enjoy it. It has improved so much over the past few months, and while I'm really enthusiastic about what has been achieved, I can't enjoy it at all like this.

Regards,

Gryph


As a supplement to my previous post: I have done some more testing, and switching off vertical sync seemed to give a good performance boost. However, the trees now appear completely leafless, and it is the trees that wreck performance (looking into a huge city gave me 50 or so fps, while checking out a couple of trees kicked it down into the 10 fps range). Buildings seemed to do far less damage than with vsync on.

This is how the trees look with vsync off:

[screenshot: trees.jpg]

The palm trees, however, don't give these problems.

Anyone else seeing anything similar at all? Or am I alone here? :P

Regards,

Gryph


At low resolutions like 1024x768, the bottleneck is more your CPU than the video card.

BTW, is that 4600+ an X2 or a single core? I ask because my (2.5 GHz) 4800+ X2 sits at 50% usage (i.e. one core fully loaded), which makes me think CM isn't using as much horsepower as it should in your case.



It's a dual core socket AM2; I don't think they even sell the 4600+ as a single core :D The processor seems fine when running some tests with it. In addition, I have very few background applications running on this PC anyway (pure gaming and some specialised apps), so that might explain the lower CPU usage.

Besides that, this rig runs some very CPU intensive engineering applications too, and has little trouble coping with those, although CPU usage is a bit higher then ;)

Even so, that still doesn't explain my Agent Orange-dipped trees, nor the absolutely massive framerate hit when they are in view :P

Oh yeah, this is the BFC version of CMSF btw.


What graphics settings do you have? Have you tried lowering the AA and AF settings? It's not like you have a high end graphics card there. I'm using an 8600GT and don't have that issue.

Have you tried using newer drivers for the card? Or do you use those because other games have issues with newer drivers?



Tried all AA and AF settings, same deal across the board.

The 7900GTX outperforms the 8600GT (or should), so it's clearly not video card related. In fact, the 7900GTX is almost twice as powerful; I wouldn't call that low end ;)

And then, I'm getting approximately 60 fps overlooking an empty 300m x 300m map with 8x AA and 16x AF and vsync turned off; bang in an ENTIRE Stryker battalion on those settings and it still gives me 40 fps. However, pop in 5 (again leafless) trees and it goes down on its knees. Something must be wrong with the optimisation of buildings and trees with vsync on and off respectively.

This set of drivers is the best for the 7x series; the newer drivers are simply not optimised for the 7 series, and anything above 169.x or so won't even work on a 7 series card.


I just wanted to point out that splitting the workload across multiple cores doesn't seem to be as efficient as turning off threaded optimization in my Nvidia settings and just letting one core do the work.

I tried it both ways: with roughly 25% utilization on each of four cores, and with full utilization on one core (threaded optimization off). The single core setup gave me substantially higher frame rates, at least at the bottom end of the spectrum (startup of city ruins jumped from 5 fps to 12 fps on a single core). However, I'm not sure whether the highest MAXIMUM frame rates are achieved in the single or multiple core setup.
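For anyone who would rather test the single-core idea without touching the Nvidia control panel, a roughly equivalent effect can be had by pinning the game process to one core at launch. A minimal sketch, assuming psutil is installed; the executable path below is made up, so point it at your own install:

```python
import subprocess
import psutil  # third-party: pip install psutil

# Hypothetical install path -- adjust to wherever CMSF lives on your system.
GAME_EXE = r"C:\Games\CMSF\CM Shock Force.exe"

# Launch the game, then restrict the process to core 0 only, which mimics
# the "let one core do all the work" setup described above.
proc = subprocess.Popen(GAME_EXE)
psutil.Process(proc.pid).cpu_affinity([0])
```

The affinity only applies for that session, so it's easy to switch back and compare against the default spread-across-cores behaviour.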


Originally posted by Gryphon:

7900 GTO (really a 7900 GTX) with driver version 94.24. These drivers may be outdated, but are regarded as among the best for the 7x series graphics cards.

Finally, I think it is safe to say that even though this graphics card is not among the top models of today, it is hardly "an old piece of hardware" and runs other very demanding games extremely smoothly at the highest settings.

That's actually a fairly powerful card even by today's standards. I've seen benchmarks that show it runs circles around my 8600GTS. Of course, that's not saying much :( Poor buying decision on my part... I need to do my homework better next time, as Nvidia seems to be in the "mislead your customers" business as well as the video card business.

BTW, can anyone confirm whether the 512 MB cards, or the ones with a 256-bit bus width (as opposed to 128-bit), perform substantially better in CMSF? I'm seriously thinking of jumping to the 8800GT, which is supposed to be one of the best bangs for the buck out there...


The Nvidia 9600GT (512 MB VRAM standard) just came out for under $200... that is your best bang for your buck. Benchmarks show it's about on par with the 8800GT in many tasks.

I just got my system, but wanted a low end graphics card (EVGA 8600GT Superclocked model) to hold me over till the summer, when I'll snap up something a little nicer. I'm thinking the Nvidia 9600GT or the ATI Radeon HD3870, which is in the same price range and slightly outperforms the 9600GT in most games.


Speed-wise, the 9600GT beats the 8600GTS in every way.

Look at the benchmarks yourself on Tom's Hardware and AnandTech.

For example:

[benchmark chart: bench2a.png]

full review:

http://www.tomshardware.com/2008/02/21/nvidia_geforce_9600_gt/

I must say that it's a good card for the price. Not as good at every task as the 8800s, but that is why Nvidia is going to release the 9700 series later this year.

One thing to consider: CMSF seems to be hard on any video setup, since even people with 8800GTX models still struggle with frame rates when the graphics settings are on high. You may just have to bump down your settings a little. I run it on Improved on my 8600GT.



"ALT-T" should bring twigs n leafs back.

While working on some mod, I found that scaling down textures by half (from 1024x1024 to 512x512) helps improve performance somewhat. At least it worked for the trees. GFX quality does not suffer noticeably IMO. I'll do it for almost all textures in my game/data folder to see how much of a performance increase can be squeezed out.

Note: I did unpack all resource files in my DATA folder, and in fact I'm not using any *.brz files at all anymore. It's always direct access to the texture and data files, and the game still runs perfectly, as well as appearing to load things a bit faster IMO. :)
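If anyone wants to try the same halving trick without resizing every texture by hand, here's a rough batch-resize sketch using Python and Pillow. The folder path and the .bmp extension are assumptions on my part, so adjust them to match your unpacked files, and back up the originals first:

```python
from pathlib import Path
from PIL import Image  # third-party: pip install Pillow

# Example folder -- point this at the unpacked texture directory you want to shrink.
TEXTURE_DIR = Path(r"C:\Games\CMSF\Data")

for tex in TEXTURE_DIR.glob("*.bmp"):  # assumes the unpacked textures are BMP files
    img = Image.open(tex)
    # Halve each dimension, e.g. 1024x1024 -> 512x512.
    smaller = img.resize((img.width // 2, img.height // 2), Image.LANCZOS)
    smaller.save(tex)  # overwrites in place -- keep backups!
```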

