
Erratic Framerate Issue


Hister

Recommended Posts

3 minutes ago, Hister said:

So CM games run more smoothly on Macs than on Windows?

That is not what that means.

As you know, there is much more to a smooth experience than memory, or graphics card, or CPU; it is the combination. If you were to spring for a new Mac I am certain you would experience much better performance, but you could do that with new PC hardware too. The nice thing about Macs is you are sure not to get a system with a weak link. But you pay a premium for the system.

Link to comment
Share on other sites

3 minutes ago, IanL said:

That is not what that means.

As you know, there is much more to a smooth experience than memory, or graphics card, or CPU; it is the combination. If you were to spring for a new Mac I am certain you would experience much better performance, but you could do that with new PC hardware too. The nice thing about Macs is you are sure not to get a system with a weak link. But you pay a premium for the system.

OK, thank you for the explanation.

Ha, I was never into Macs, plus these things cost a fortune over here. I have 80 Steam games and 14 non-Steam ones (Uplay, GOG, etc.), which are mostly PC-only and not for Mac. There's no way I would switch just to make sure CM games play well. :)

Link to comment
Share on other sites

Quote

 

System Requirements MINIMUM:

  • Operating System: Win7/Win8
  • Processor: Pentium IV 1.8 GHz or equivalent speed AMD processor
  • Video Card: 256 Megabyte dedicated VRAM or better and must support 1024x768 or higher resolution in OpenGL (Intel integrated video may produce blurry text)
  • Sound Card: DirectX 10 compatible Sound Card
  • System Memory: 2 GB RAM
  • Hard Drive Space: 3.5 GB
  • Other requirements: DVD drive (for hardcopy version only)
  • The game does not work in a virtualized environment (virtual machine)

System Requirements SUGGESTED:

  • Operating System: Windows 10
  • Processor: Pentium IV 2.8 GHz or equivalent speed AMD processor or better
  • Sound Card: DirectX 12 compatible Sound Card
  • Video Card: 1 GB dedicated VRAM or better and must support 1024x768 or higher resolution in OpenGL (Intel integrated video may produce blurry text)
  • System Memory: 4 GB or more RAM
  • Hard Drive Space: 5 GB
  • Other requirements: DVD drive (for hardcopy version only)
  • The game does not work in a virtualized environment (virtual machine)

 

 

These are the minimum and suggested system requirements stated on the Battlefront store page for CMBS (the other games are the same; naturally, only the hard drive space differs) - 4 gigs of RAM is indeed suggested.

I have a much better AMD CPU than the one suggested and 100% more VRAM, yet I am getting those issues and can only play up to large missions with 3D models and textures set to lowest. That does not square with the suggested requirements. Does the suggested system requirement mean you can max out all the ingame settings and play any stock scenario at 20+ frames? Or does it mean something else entirely? If it were accurate I shouldn't be getting such low frames (that is, if my rig does not indeed have an issue). Note that most of the games I own pose no particular issues like the CM games do, with a few exceptions (where the developers admitted the problem was on their side and not mine - AGEOD games and Ultimate General: Civil War, for example). I know, probably none of the others I own are built on an OpenGL chassis.

Anyway, if there are any other insights into this I am all ears. I'm resorting to playing the game on the lowest settings until I buy a new rig and see if I'll have more luck the third time. :D

Edited by Hister
Link to comment
Share on other sites

I did some testing a while back and tried it again with the Linnet example above. To me it seems like one of the biggest loads on FPS is the number of units on screen at the same time, which is exacerbated when panning. This might include fortifications, as in the Linnet setup. One way to improve FPS in this example is to maintain the camera angle but zoom in. The fewer units, the better the FPS. In a way this makes sense, as in the orders phase no action or movement can put a load on the computer, but I suppose it needs to keep track of where the units are in the 3D environment while redrawing what is on screen when panning. The testing I did before, IIRC, also showed improvement during the action phase; that is, a lower camera angle, zoomed in a bit, improved FPS. Now, I don't know if that is something that BFC can improve on given the current engine, but one can hope.
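That observation fits how most 3D engines cull work per frame: only units inside the camera's view cone need drawing, so zooming in shrinks the set. A minimal 2D sketch of view-frustum culling (purely illustrative, not CM's actual renderer):

```python
import math

def visible_units(units_xy, cam_xy, facing_deg, fov_deg, max_range):
    """Keep only units inside a simple 2D view cone; fewer survivors = less draw work."""
    kept = []
    for (x, y) in units_xy:
        dx, dy = x - cam_xy[0], y - cam_xy[1]
        if math.hypot(dx, dy) > max_range:
            continue                      # beyond draw distance
        bearing = math.degrees(math.atan2(dy, dx))
        offset = (bearing - facing_deg + 180) % 360 - 180  # wrap to [-180, 180]
        if abs(offset) <= fov_deg / 2:
            kept.append((x, y))
    return kept

# A 20x20 grid of units, camera in the corner looking into the map:
units = [(i * 10.0, j * 10.0) for i in range(20) for j in range(20)]
high_wide = visible_units(units, (0, 0), 45, 90, 500)   # zoomed-out overview
zoomed_in = visible_units(units, (0, 0), 45, 30, 150)   # tighter, closer view
print(len(high_wide), "vs", len(zoomed_in), "units to draw per frame")
```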

Link to comment
Share on other sites

Hister,

Your issue should really be in its own thread in the Tech Support Forum.  This really isn't the place for it.  I suggest rounding up all your information and starting a new thread there.  I can think of a few things to suggest already.  But some general info for everybody seeing this thread... CM2's core graphics code was written between 2005 and 2006 on OpenGL.  Although it has been revised many, many, many times, inherently it's using code written for old hardware standards.  Gameplay speed and performance improve with advances in hardware, but there are lots of features/techniques that we're not using because we'd have to rewrite the game's core.  That's not possible.

The single biggest FPS crunch comes when trying to animate large amounts of stuff.  Ground level limits your world view and the number of units, hence FPS tends to improve when going down to the ground.  That's normal.

For massive scenarios, 20fps from high up with all quality settings turned up to full is not considered "slow" by CM standards. Lower than that is not what I'd expect to see from a good system.

Steve

Link to comment
Share on other sites

@Hister

At IanL's request for assistance, I'll add my info. (Hey, if Steve wants you to move this, I'd suggest a Moderator move the entire thread...)

I've got a machine with roughly similar specs. 

GTX 670 (4GB of VRAM, vs. the "standard" card's 2GB) vs. your GTX 660Ti

Let's dive into those, for the moment.

I've got mine driving a 1920x1080 screen. Yours is pushing a widescreen 2560x1080. So, simple maths tells us that your card has a 33% greater load on it...for every frame...than mine does. Looking at hardwarecompare.com (which only lists the 2GB version of my card), some interesting stats pop up: mine is 24% better at Firestrike, 33% better in memory bandwidth, and 33% better in pixel rate. None of this is directly transferable to CM performance, but it does give a sense of how our cards stack up.

I feel comfortable stating that the 670 4GB probably offers about 33% better performance than the 660Ti. (Call it 25% if you want. Shrug. These are rough comparisons.) Given that performance difference AND the extra pixels your card is pushing, with our cards and our screens, I'd expect my 670 to perform about 75% better. (1.33 * 1.33 ≈ 1.77: 33% better performance times 33% fewer pixels.) Rough numbers.
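For anyone who wants to check that arithmetic, here it is spelled out (a quick worked example, nothing more):

```python
# Per-frame pixel load of the two screens being compared.
pixels_1080p = 1920 * 1080          # 2,073,600 pixels
pixels_ultrawide = 2560 * 1080      # 2,764,800 pixels

extra = pixels_ultrawide / pixels_1080p - 1
print(f"extra pixels per frame: {extra:.0%}")        # ~33%

# Stacking ~33% better card performance on ~33% fewer pixels:
print(f"combined advantage: {1.33 * 1.33:.2f}x")     # ~1.77x, i.e. ~75% better
```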

(By the way, I'm working this out as I type. That performance difference is FAR greater than I'd expect.)

CPU: the machine I'm looking at has an AMD FX 8350 (running stock at 4GHz). It has 16GB of RAM (crap! I just checked: the RAM is only at 1600MHz. WTF?), running Win10 on an AMD mobo (Asus M5A88M, BIOS v1702). (After this, I'm going to have to rip into that RAM and see why I have it so low.)

Comparing CPUs, there is about the same difference in performance.

Overall, I'd expect about 33% to 50% better performance than you're seeing. Let's boot it up and find out...

Huh. I cannot find "The Copse". Back in a jiff...

Edited...and I'm back.

Loaded "BP1 The Copse". All in-game options as high as they go, same screen location as you, and FRAPS shows 15. More, anon. (This is worse than I thought...but still smooth. It is also the lowest I see. If I move the camera around, my fps goes up to 60 (where I have it capped).)

 

Edited by c3k
Link to comment
Share on other sites

25 minutes ago, c3k said:

I've got mine driving a 1920x1080 screen. Yours is pushing a widescreen 2560x1080. So, simple maths tells us that your card has a 33% greater load on it...for every frame...than mine does.

I was waiting for Hister to start up a new thread before addressing this, but since you brought it up and I moved the old content to a new thread, time to dig into this one :D

I'd say this is the #1 likely cause of problems for Hister.  It's simple math... the more polygons on the screen, the more strain is put on the hardware.  Having a huge scenario seen from a high altitude with a massive screen size setting and good quality settings is simply not going to work out very well.  So the first thing I'd advise Hister to do is reduce the screen resolution to something more reasonable and see how that affects overall performance.

A quick explanation of what I call "porpoising", a term commonly used with power boats to describe the bow going up too much and then down too much when under power.  This not only interferes with good performance from the boat, it can also mean losing your lunch :D

What's happening is when the graphics are favorably aligned the system can handle everything and the speed goes up, but as soon as anything happens that crosses some sort of capability line, the speed crashes.  Up and down, up and down, up and down as the conditions alternate between favorable and unfavorable.

The solution is to figure out why the hardware is getting overwhelmed.  As stated, the most likely suspect is the massive screen resolution.  Turn that down and you will, at the very least, increase your top speed and decrease how slow things get.  You might still porpoise, but it will be less extreme and therefore less noticeable.  If it's still unacceptable to you, then figure out what else can be changed to reduce the strain on the card: a smaller screen resolution, lower quality settings, turning off fancy card features, etc.  There's no one simple answer because there are far too many PC-specific variables at play.
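To put rough numbers on the porpoising idea: the average FPS can look acceptable while the slow frames are what you actually feel. A small sketch (made-up frame times, not measured CM data):

```python
import statistics

def porpoising_report(frame_times_ms):
    """Summarize oscillating frame times: the average hides the stutter."""
    avg = statistics.mean(frame_times_ms)
    worst = max(frame_times_ms)
    print(f"average FPS     : {1000 / avg:.1f}")
    print(f"worst-frame FPS : {1000 / worst:.1f}")
    print(f"frame-time stdev: {statistics.stdev(frame_times_ms):.1f} ms")

# Conditions alternating between favorable and unfavorable, as described above:
porpoising_report([16, 16, 70, 16, 16, 70, 16, 16, 70])
# average FPS ~29, worst-frame FPS ~14 -- the dips are what you notice.
```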

Steve

Link to comment
Share on other sites

Well, I decided to fire up "BP1 The Copse" on a much more powerful machine...and only got 20fps with that initial shot. (Now, 20 is 33% better than 15...). This machine is MUCH better than the FX8350/670 rig. I do keep all options checked and I have not dug into my various control panel AA settings and whatnot.

@Hister, I'll dig into this a bit. Remember, though: My fps went way up once I moved the camera around. There may be something with one of the high-res (close LOD) models which eats up gpu cycles. And, I'm running some mods. In order to get apples to apples, we'd have to rip into all that stuff.

Link to comment
Share on other sites

Heya guys, thank you all for the help!

I stated already that I had lower-resolution monitors in the past on the same two rigs on which I played CM. I was already experiencing the same issues with the two smaller monitors, but for the heck of it I will plug in a smaller-native-resolution monitor and do the testing.

Note that ultrawide monitors don't need 33% more power than classic 1080p monitors in practice (normal math doesn't apply here), but I digress; let's wait for the test results to come in.

Link to comment
Share on other sites

28 minutes ago, Hister said:

Note that ultrawide monitors don't need 33% more power than classic 1080p monitors in practice

Actually, they do :D  It's a simple thing... the more polygons shown on your screen, the more processing power it takes to put them there.  Seeing more of the battlefield is what you get when you pump up the screen resolution, therefore you put additional strain on your system.  It's as much of a law of physics as is gravity.

Steve

Link to comment
Share on other sites

Just to chime in on this, I run all my CM games on a laptop with a 1920*1080 screen but most of the time have it hooked up to an external screen that is 1920*1200. Only slightly larger, but I find that the games run smoother on it than on the smaller built-in screen. Perhaps there's some other explanation for this, but I'm not really a tech wiz.

Link to comment
Share on other sites

3 minutes ago, rocketman said:

Just to chime in on this, I run all my CM games on a laptop with a 1920*1080 screen but most of the time have it hooked up to an external screen that is 1920*1200. Only slightly larger, but I find that the games run smoother on it than on the smaller built-in screen. Perhaps there's some other explanation for this, but I'm not really a tech wiz.

Traditionally, video out is not as good as internal video on laptops.  Has to do with costs and tight space, I'm guessing.  So it could be that.

Steve

Link to comment
Share on other sites

13 hours ago, Battlefront.com said:

To add to c3k's comments, don't completely trust the FPS number you see.  Instead, use it along with your sense of how smooth things are running.  It is entirely possible that the FPS number might be lower than what your eye perceives due to the way FPS values are sampled.

Steve

Any chance of getting an FPS counter hotkey added to the game for end users at some point? At least then the FPS result will come from the same source when users report problems.

Should add that I've always had issues with FPS across the multiple machines I've played the games on, and I splurge on top-end graphics cards (mostly for other titles) and CPUs when it's time for a new desktop. It's not as big an issue for a game like CM compared to other AAA titles, that's for sure, but with a 4GB VRAM card and plenty of regular RAM it does still jar a bit when the FPS counter dips and the draw distance for detailed terrain textures is so close to my viewpoint at some elevations.
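On the FPS-counter request: part of why numbers from different tools disagree is the sampling window. A toy comparison of two common counting styles (hypothetical, not how CM or FRAPS actually measures):

```python
def fps_per_second(frame_times_ms):
    """Bucket style: count whole frames completed in each ~1-second window."""
    buckets, elapsed, frames = [], 0.0, 0
    for t in frame_times_ms:
        elapsed += t
        frames += 1
        if elapsed >= 1000.0:
            buckets.append(frames)
            elapsed, frames = 0.0, 0
    return buckets

def fps_instantaneous(frame_times_ms):
    """Per-frame style: 1000 / last frame time; far jumpier than the eye perceives."""
    return [1000.0 / t for t in frame_times_ms]

trace = [20, 20, 80, 20, 20, 80] * 10      # a porpoising trace, in ms
print(fps_per_second(trace))               # steady-looking buckets near 25
print(fps_instantaneous(trace)[:6])        # [50.0, 50.0, 12.5, 50.0, 50.0, 12.5]
```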

Edited by Ithikial_AU
Link to comment
Share on other sites

OK guys, here I am. 

I dusted off the two old monitors I have. Their native resolutions are 1600x1050 and 1280x768, and I tested both scenarios on them.

 

BP1 The Copse scenario:

The 1600x1050 monitor gave me 24 frames in that same camera position with all the same ingame settings. 

The 1280x768 monitor gave me 29 frames in that same camera position with all the same ingame settings. 

 

BP1 Op Linnet II scenario:

The 1600x1050 monitor gave me 7 frames in that same camera position with all the same ingame settings. 

The 1280x768 monitor gave me 7 frames in that same camera position with all the same ingame settings. 

I also tried lowering the ingame settings to fastest/fastest on both monitors in this scenario but to no avail - framerate remained at 7/8. 

 

18 hours ago, Battlefront.com said:

Actually, they do :D  It's a simple thing... the more polygons shown on your screen, the more processing power it takes to put them there.  Seeing more of the battlefield is what you get when you pump up the screen resolution, therefore you put additional strain on your system.  It's as much of a law of physics as is gravity.

Steve

Well, I have a friend in the UK who has both a 1920x1080 and a 2560x1080 monitor, and he told me he experienced no extra strain, or only a little more, at 2560 versus 1920 in the same games. Thinking about it, that was probably because his GPU wasn't maxed out at 1920; otherwise the results he saw wouldn't match what you are saying, Steve, is a common law of physics. But then again, if your game followed the common laws of physics, then every more powerful rig would have to churn out more frames from your game, and that is not the case, because there are all sorts of bottlenecks present on either side, as you have already stated on numerous occasions. See! ;)

I've read people claiming the extra strain to be anywhere from 5% up to 30% more at 2560 vs 1920, depending on the game tested and the GPU used, so it is not a given that you get 33% more strain, and thus 33% fewer frames, in every game on every GPU just because there are 33% more pixels to churn out.

 

OK, back to the results of the test. The 1280x768 monitor gave me 40 frames in that same camera position with the 3D models and textures ingame settings lowered to fastest/faster. When panning the camera around it went up to 60 frames in that scenario and mostly kept at 50. Still, there's no way I would swap my current monitor for that old square abomination of a monitor, he he. I prefer to play on the current one with ingame settings set to fastest/faster, since I discovered it gives me above 30 frames in most situations (except in the biggest scenarios, of course) in exchange for blurrier textures and a very close draw line (which is less pronounced because the blurrier textures keep it from popping as much anyway).

 

Since I am getting low frames in the biggest scenarios regardless of the graphical options used, the bottleneck might well be the CPU rather than the GPU. That would also explain why George MC, who gave me the GPU, wasn't experiencing the same issues on his rig. I also got 3 frames in the Arnheim scenario at the moment my British paras all opened fire together on the Germans advancing on the bridge, while before the shooting started my frames were much higher. Maybe the game and the AMD FX 6300 just don't go well together. Or certain combos of AMD CPUs and Nvidia GPUs clash with the way the game calculates things. I don't know. I would like to know. Could there be some other faulty hardware piece, like the RAM not being of the right frequency or something? The system states I have 8 gigs of RAM installed, so it's not that I've put the sticks in the wrong slots or that only one is working.

Now is not the time to be buying a new GPU, with prices artificially inflated by the mining craze; I'm awaiting the new technology to arrive in February or so (then miners can go f*** themselves).

I re-read all the forum topics regarding the performance issues people are experiencing and discovered a dissonance between what the developers deem acceptable game performance and what certain users see as such (not those over 70 who can't spot a difference between 15 and 60 frames on their screens, lol). Knowing that 20 to 30 frames is what you must expect from the game makes one more at ease. When I get older I might also not spot a difference in screen fluidity, and by then I guess I'll be fully satisfied with the way CM games perform. ;)

 

 

Link to comment
Share on other sites

2 hours ago, Hister said:

OK, back to the results of the test. The 1280x768 monitor gave me 40 frames in that same camera position with the 3D models and textures ingame settings lowered to fastest/faster. When panning the camera around it went up to 60 frames in that scenario and mostly kept at 50. Still, there's no way I would swap my current monitor for that old square abomination of a monitor, he he. I prefer to play on the current one with ingame settings set to fastest/faster, since I discovered it gives me above 30 frames in most situations (except in the biggest scenarios, of course) in exchange for blurrier textures and a very close draw line (which is less pronounced because the blurrier textures keep it from popping as much anyway).

 

Since I am getting low frames in the biggest scenarios regardless of the graphical options used, the bottleneck might well be the CPU rather than the GPU. That would also explain why George MC, who gave me the GPU, wasn't experiencing the same issues on his rig. I also got 3 frames in the Arnheim scenario at the moment my British paras all opened fire together on the Germans advancing on the bridge, while before the shooting started my frames were much higher. Maybe the game and the AMD FX 6300 just don't go well together. Or certain combos of AMD CPUs and Nvidia GPUs clash with the way the game calculates things. I don't know. I would like to know. Could there be some other faulty hardware piece, like the RAM not being of the right frequency or something? The system states I have 8 gigs of RAM installed, so it's not that I've put the sticks in the wrong slots or that only one is working.

No big scenarios for you then. My first machine had trouble with large scenarios, but never as bad as what you are experiencing. I was always able to get the fps up to 15 or so by dropping the image quality, which is enough for not-too-painful camera movement. It was not a particularly satisfying experience. So, I bought a new machine. Thankfully I was not the only person in my house who had trouble with the performance of the old machine. That made getting approval from the finance department easier :)

2 hours ago, Hister said:

Now is not the time to be buying a new GPU, with prices artificially inflated by the mining craze; I'm awaiting the new technology to arrive in February or so (then miners can go f*** themselves).

Yeah, I'm with ya there.

2 hours ago, Hister said:

I re-read all the forum topics regarding the performance issues people are experiencing and discovered a dissonance between what the developers deem acceptable game performance and what certain users see as such (not those over 70 who can't spot a difference between 15 and 60 frames on their screens, lol). Knowing that 20 to 30 frames is what you must expect from the game makes one more at ease.

The engine we have is what we have. When you are speccing out your new machine some day, I would recommend spending a bit more on the motherboard (in other words, get one that has a faster bus rather than the cheapest available) and focusing on a better CPU over a better GPU. Obviously there is quite a steep price curve at the top end of CPUs and GPUs; all I mean is that if you have X dollars to spend on a CPU and a GPU, spend more than half on the CPU, since extra $ there will give you better bang for your buck. Also, it seems that you might be better off getting a higher-GHz i5 than a slower-GHz i7, since CM is mostly single-core.

2 hours ago, Hister said:

When I get older I might also not spot a difference in screen fluidity, and by then I guess I'll be fully satisfied with the way CM games perform. ;)

LOL you young pups. Think of all the time you have to look forward to playing the game. :D

Link to comment
Share on other sites

30 minutes ago, IanL said:

The engine we have is what we have. When you are speccing out your new machine some day, I would recommend spending a bit more on the motherboard (in other words, get one that has a faster bus rather than the cheapest available) and focusing on a better CPU over a better GPU. Obviously there is quite a steep price curve at the top end of CPUs and GPUs; all I mean is that if you have X dollars to spend on a CPU and a GPU, spend more than half on the CPU, since extra $ there will give you better bang for your buck. Also, it seems that you might be better off getting a higher-GHz i5 than a slower-GHz i7, since CM is mostly single-core.

OK, I'll resort to the help provided by you guys in suggesting what motherboard, CPU, RAM and GPU to get when the time comes. I'll know whom to blame if my performance isn't OK. ;)

Link to comment
Share on other sites

On 10/24/2017 at 10:52 AM, c3k said:

There may be something with one of the high-res (close LOD) models which eats up gpu cycles.

I have noticed some LOD issues here and there, like texture pop and incorrect LOD adjustment, i.e. a lower-poly model popping in at a distance when higher-poly models are all around it.
Nothing that should cause any real framerate issues though, I record framerate every time I play, and have seen no huge anomalies.

 

3 hours ago, Hister said:

BP1 The Copse scenario:

The 1600x1050 monitor gave me 24 frames in that same camera position with all the same ingame settings. 

The 1280x768 monitor gave me 29 frames in that same camera position with all the same ingame settings. 

 

BP1 Op Linnet II scenario:

The 1600x1050 monitor gave me 7 frames in that same camera position with all the same ingame settings. 

The 1280x768 monitor gave me 7 frames in that same camera position with all the same ingame settings. 

I also tried lowering the ingame settings to fastest/fastest on both monitors in this scenario but to no avail - framerate remained at 7/8.

That's because Op Linnet II isn't a scenario, it's a torture chamber, designed by an unrepentant sadomasochist. ;)

Link to comment
Share on other sites

@Hister

Some comments.

Your buddy who stated that the 2560 vs 1920 screen shouldn't be that much of a strain...must be talking about other games. Those games may be CPU bound, not GPU bound. In that case, the 33% greater load your widescreen places on the GPU is not the limiting factor. However, be assured that 33% more pixels on a screen is, and always will be, a 33% greater load on the GPU. Every time. It may not cause a 33% change in fps...but that's because of other factors limiting the framerate.
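Those "other factors" can be captured in a toy bottleneck model: the frame rate is set by whichever stage is slower, so extra pixels only hurt when the GPU is the wall (illustrative numbers only, not measured CM data):

```python
def fps_estimate(cpu_ms_per_frame, gpu_ms_per_megapixel, width, height):
    """Toy model: the slower of the CPU and GPU stages sets the frame rate."""
    gpu_ms = gpu_ms_per_megapixel * (width * height / 1e6)
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms)

# CPU-bound case: 33% more pixels barely moves the needle...
print(fps_estimate(40.0, 5.0, 1920, 1080))    # ~25 fps (CPU is the wall)
print(fps_estimate(40.0, 5.0, 2560, 1080))    # still ~25 fps

# ...GPU-bound case: close to the full 33% hit shows up.
print(fps_estimate(5.0, 10.0, 1920, 1080))    # ~48 fps
print(fps_estimate(5.0, 10.0, 2560, 1080))    # ~36 fps
```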

For the CM series, as IanL stated and you've discovered, there's not a strong link between GPU and framerate. This game is MUCH more dependent on data throughput than other games. On one of my rigs I've got CM on a hard drive, a WD Black. This particular rig has 64GB of RAM. That doesn't matter: the load times are ridiculous. I cannot imagine the amount of data being pulled off the drive and manipulated. Putting CM on an SSD makes a HUGE load-time difference. That gives an inkling of the data being pushed around.

For smoothness, I've found that using Freesync/Gsync gives great results with CM. Those low frame rates don't seem so slow when it's all linked.

Ken

Link to comment
Share on other sites

28 minutes ago, SLIM said:

That's because Op Linnet II isn't a scenario, it's a torture chamber, designed by an unrepentant sadomasochist. ;)

Ha ha, OK, I will steer clear of it then. :D

 

25 minutes ago, IanL said:

LOL, good thing you don't know where we live then. :)

Hmmm, I'll just ask Prime Minister Justin about your whereabouts - he seems a nice enough guy to help someone like me with my mission. ;)

 

10 minutes ago, c3k said:

Your buddy who stated that the 2560 vs 1920 screen shouldn't be that much of a strain...must be talking about other games. Those games may be CPU bound, not GPU bound. In that case, the 33% greater load your widescreen places on the GPU is not the limiting factor. However, be assured that 33% more pixels on a screen is, and always will be, a 33% greater load on the GPU. Every time. It may not cause a 33% change in fps...but that's because of other factors limiting the framerate.

OK, got it. 

10 minutes ago, c3k said:

For the CM series, as IanL stated and you've discovered, there's not a strong link between GPU and framerate. This game is MUCH more dependent on data throughput than other games. On one of my rigs I've got CM on a hard drive, a WD Black. This particular rig has 64GB of RAM. That doesn't matter: the load times are ridiculous. I cannot imagine the amount of data being pulled off the drive and manipulated. Putting CM on an SSD makes a HUGE load-time difference. That gives an inkling of the data being pushed around.

Yes, but you see, my CPU is much better than the one suggested in the system requirements... That's the whole gist of it.

When I got me an SSD, loading times for the game were much shorter, yes.

13 minutes ago, c3k said:

For smoothness, I've found that using Freesync/Gsync gives great results with CM. Those low frame rates don't seem so slow when it's all linked.

Well, I bought my ultra-wide this year and the ones with Freesync/Gsync were way too expensive. It would be really cool to own one though - I'm interested in seeing how it works. Knowing what I know now, I would also probably not buy a 1920x1080 IPS with Freesync/Gsync, because that would still be 3 times more expensive than what I splurged on the current one, and I'm loving all the extra monitor space. Plus I didn't know back then, and I still don't know now, whether I'll buy an AMD or Nvidia GPU. Do the newest AMD GPUs still have problems with CM games?

Link to comment
Share on other sites

Freesync is free. AMD only charges a few cents for screen manufacturers to use the licensed technology. The "cost" comes in when you realize that only AMD cards can utilize Freesync. Nvidia charges ~$200 (+) for the Gsync license fees. Those screens show that price. Again, only Nvidia cards can use Gsync. (Some of the Gsync monitors are 144Hz. Those are more expensive than vanilla 120Hz and then they also add in the Gsync fees.)

CPU. Although the game is cpu bound, there is more to it than just "does my cpu meet minimums or recommended".  CM, like most games, is not even close to multi-core capable. It has offloaded one of its subroutines to a second core (if available), but most of the game runs on a single core. (This is a simplification...because that's about the limit of my understanding. ;) ) Core speed is more important than multicores or thread throughput. (Most cpus are pushing multi these days, and bragging on it. None of that helps CM.) So, a good core speed is more important than more cores. (I wouldn't put less than 4 cores/4 threads in anything these days. Gotta let some background stuff run...) In addition to the cpu core speed, the data throughput is important. I don't know what that means wrt CM. I do know that "better" rigs run CM faster. I don't know if bus width, memory controller, RAM speed, RAM size, swap file, cache size, pipelines, or anything else matters more than any other.

All else being equal (big weasel words there), an AMD 12 core/24 thread "Threadripper" (gotta love their marketing names!) at 3.5GHz will not run the game any faster than an i3 2 core/2 thread cpu at the same 3.5GHz. AIUI. Maybe some day...
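In other words, for a single-core-bound game the toy model looks like this (again illustrative, with a made-up scaling constant):

```python
def single_core_bound_fps(clock_ghz, cores):
    """Toy model of a single-core-bound game: cores beyond the first add ~nothing."""
    FPS_PER_GHZ = 6.0        # made-up constant, purely for illustration
    _ = cores                # deliberately ignored -- that's the point
    return FPS_PER_GHZ * clock_ghz

print(single_core_bound_fps(3.5, 12))   # 21.0 -- 12-core "Threadripper" at 3.5GHz
print(single_core_bound_fps(3.5, 2))    # 21.0 -- dual-core i3 at the same clock
print(single_core_bound_fps(4.5, 4))    # 27.0 -- clock speed moves the needle
```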

A balanced system with emphasis on data manipulation rather than video creation will provide a better CM experience. IMO.

Link to comment
Share on other sites

3 minutes ago, c3k said:

Freesync is free. AMD only charges a few cents for screen manufacturers to use the licensed technology. The "cost" comes in when you realize that only AMD cards can utilize Freesync. Nvidia charges ~$200 (+) for the Gsync license fees. Those screens show that price. Again, only Nvidia cards can use Gsync. (Some of the Gsync monitors are 144Hz. Those are more expensive than vanilla 120Hz and then they also add in the Gsync fees.)

CPU. Although the game is cpu bound, there is more to it than just "does my cpu meet minimums or recommended".  CM, like most games, is not even close to multi-core capable. It has offloaded one of its subroutines to a second core (if available), but most of the game runs on a single core. (This is a simplification...because that's about the limit of my understanding. ;) ) Core speed is more important than multicores or thread throughput. (Most cpus are pushing multi these days, and bragging on it. None of that helps CM.) So, a good core speed is more important than more cores. (I wouldn't put less than 4 cores/4 threads in anything these days. Gotta let some background stuff run...) In addition to the cpu core speed, the data throughput is important. I don't know what that means wrt CM. I do know that "better" rigs run CM faster. I don't know if bus width, memory controller, RAM speed, RAM size, swap file, cache size, pipelines, or anything else matters more than any other.

All else being equal (big weasel words there), an AMD 12 core/24 thread "Threadripper" (gotta love their marketing names!) at 3.5GHz will not run the game any faster than an i3 2 core/2 thread cpu at the same 3.5GHz. AIUI. Maybe some day...

A balanced system with emphasis on data manipulation rather than video creation will provide a better CM experience. IMO.

Auuuch, I was only checking the Gsync monitors, thinking Freesync would fall in the same price range. Ugh. Well, good to know, thanx!

The suggested CPU is a Pentium IV 2.8 GHz. My AMD FX 6300 runs at 3.5 GHz! So it must be something other than that.

Link to comment
Share on other sites

As far as I'm aware the 'minimum' and 'recommended' specifications haven't really been changed/updated since CMSF in 2007. The engine has been updated significantly since then with regards to graphics. This is especially true of the 'version 2' engine that actually reduced the 3D model complexity and replaced it with bump-mapping to provide the missing model detail - with the idea of improving video performance. However there may be more calculations for the CPU to perform with the newer engines (more complex LOS/LOF, etc.).

I would generally assume that an AMD FX 6300 should have been a decent performer for CM games. Admittedly newer Intel CPUs will have an advantage over the AMD FX series since they execute more 'instructions per clock' (IPC) resulting in better single-core performance (comparing at the same clock speed). To my knowledge the only code that is 'multi-core/thread' in CM is the scenario loading process. It was one of the few places in the code that could easily benefit from more cores/threads without having to significantly change the engine. Interestingly AMD GPUs suffer a huge hit in the scenario loading process for some reason. I can only assume that there is some weakness in the AMD OpenGL video drivers that is getting hit hard by the loading process (perhaps this is one of those processes that has to run on the CPU instead of the GPU for AMD as Steve mentioned earlier). Running a Nvidia GPU on the same system could potentially result in load times that are 2 - 3 times faster. Of course the issue in this thread isn't scenario loading times.

As others have pointed out, you're running a huge scenario for comparison; something that tends to slow down even the fastest machines. This may be something of a CPU limitation/bottleneck, but I have no idea what is possibly being 'calculated' during screen movement that could result in lower FPS. In general CM video performance is slowed down by the number of units on screen, the number of buildings, the amount of trees and the complexity of the map (the number and magnitude of elevations). With a larger horizontal resolution ('2560') than the average '1920', your 'window' is much larger for what needs to be rendered on screen. The CM graphics engine tends to cut down on textures (their number and detail) after a certain range in order to maintain 'performance'. I don't know what the algorithm is for these calculations, but it may be a bit dated in its assumptions of GPU performance and memory capacity. However it is quite probable that those 'assumptions' may not be easily changed with the current engine. There are also fewer LODs for each model than other major games may have ('art' is expensive); more LOD-level models/textures could result in somewhat smoother framerates.
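The range-based cutback described above is usually a distance-to-LOD lookup along these lines (a generic sketch; the thresholds, level count, and descriptions are invented, not CM's actual values):

```python
# Distance-based LOD pick: fewer levels mean bigger, more visible jumps in
# detail (and in per-frame cost) as the camera moves through the thresholds.
LOD_LEVELS = [
    (50.0,  "full-detail model, full-res textures"),
    (150.0, "reduced model, half-res textures"),
    (400.0, "low-poly stand-in, low-res textures"),
]

def pick_lod(distance_m):
    for max_dist, level in LOD_LEVELS:
        if distance_m <= max_dist:
            return level
    return "not drawn (beyond draw distance)"

for d in (25, 120, 300, 600):
    print(f"{d:4d} m -> {pick_lod(d)}")
```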

I have an FX 8320 which also runs at 3.5GHz (it just has 8 cores instead of 6 like the FX 6300 has) and a GTX 660Ti that I could test (on a cheap Asrock motherboard with an AMD 880 chipset). I'll have to install CMBN and see what I get, but I suspect the performance will be pretty much similar to what you are seeing already.

Link to comment
Share on other sites
