Hister

Posts posted by Hister

  1. 33 minutes ago, Schrullenhaft said:

    As far as I'm aware the 'minimum' and 'recommended' specifications haven't really been changed/updated since CMSF in 2007. The engine has been updated significantly since then with regards to graphics. This is especially true of the 'version 2' engine that actually reduced the 3D model complexity and replaced it with bump-mapping to provide the missing model detail - with the idea of improving video performance. However there may be more calculations for the CPU to perform with the newer engines (more complex LOS/LOF, etc.).

    Ha, my snobbish answer would be: then update the requirements to match the real demands of the game (if they really need updating, that is). ;) 

    35 minutes ago, Schrullenhaft said:

    I would generally assume that an AMD FX 6300 should have been a decent performer for CM games. Admittedly newer Intel CPUs will have an advantage over the AMD FX series since they execute more 'instructions per clock' (IPC) resulting in better single-core performance (comparing at the same clock speed). To my knowledge the only code that is 'multi-core/thread' in CM is the scenario loading process. It was one of the few places in the code that could easily benefit from more cores/threads without having to significantly change the engine. Interestingly AMD GPUs suffer a huge hit in the scenario loading process for some reason. I can only assume that there is some weakness in the AMD OpenGL video drivers that is getting hit hard by the loading process (perhaps this is one of those processes that has to run on the CPU instead of the GPU for AMD as Steve mentioned earlier). Running a Nvidia GPU on the same system could potentially result in load times that are 2 - 3 times faster. Of course the issue in this thread isn't scenario loading times.

    OK, so the AMD processors are really not the best choice when it comes to the CM games. Did you consider adding a warning note next to the CM products that users with AMD CPUs are more prone to less smooth gameplay - or spinning it positively and saying something like "the game is optimized for Intel CPUs"? It might hurt your sales, I know, but fair is fair.  

     

    41 minutes ago, Schrullenhaft said:

    As others have pointed out, you're running a huge scenario for comparison; something that tends to slow down even the fastest machines. This may be something of a CPU limitation/bottleneck, but I have no idea what is possibly being 'calculated' during screen movement that could result in lower FPS. In general CM video performance is slowed down by the number of units on screen, the number of buildings, the amount of trees and the complexity of the map (the number and magnitude of elevations). With a larger horizontal resolution ('2560') than the average '1920' your 'window' is much larger for what needs to be rendered on screen. The CM graphics engine tends to cut down on textures (their number and detail) after a certain range in order to maintain 'performance'. I don't know what the algorithm is for these calculations, but they may be a bit dated in their assumptions of GPU performance and memory capacity. However it is quite probable that those 'assumptions' may not be easily changed with the current engine. There are also fewer LODs for each model than other major games may have ('art' is expensive), which could result in somewhat smoother framerates if there were more LOD level models/textures.

    Thank you for taking the time to explain all of this. I already knew what the gist of the game is, so I'm not trying to compare it to other AAA titles - it's a special gem with all the quirks that come with it. Since you mentioned my ultrawide monitor: I also provided a test with two smaller monitors in one of my previous posts, and there was no difference in that huge scenario - frames remained at 7 or 8 no matter the level of detail I chose on either monitor, exactly the same as on my ultra-wide one. 

     

    47 minutes ago, Schrullenhaft said:

    I have an FX 8320 which also runs at 3.5GHz (it just has 8 cores instead of 6 like the FX 6300 has) and a GTX 660Ti that I could test (on a cheap Asrock motherboard with an AMD 880 chipset). I'll have to install CMBN and see what I get, but I suspect the performance will be pretty much similar to what you are seeing already.

    Well, if you do take the time to fire up the game (the scenario in question is from Battle Pack 1) and check the framerates for that particular scenario, that would be very interesting to see. Thank you! 

  2. 3 minutes ago, c3k said:

    Freesync is free. AMD only charges a few cents for screen manufacturers to use the licensed technology. The "cost" comes in when you realize that only AMD cards can utilize Freesync. Nvidia charges ~$200 (+) for the Gsync license fees. Those screens show that price. Again, only Nvidia cards can use Gsync. (Some of the Gsync monitors are 144Hz. Those are more expensive than vanilla 120Hz and then they also add in the Gsync fees.)

    CPU. Although the game is cpu bound, there is more to it than just "does my cpu meet minimums or recommended".  CM, like most games, is not even close to multi-core capable. It has offloaded one of its subroutines to a second core (if available), but most of the game runs on a single core. (This is a simplification...because that's about the limit of my understanding. ;) ) Core speed is more important than multicores or thread throughput. (Most cpus are pushing multi these days, and bragging on it. None of that helps CM.) So, a good core speed is more important than more cores. (I wouldn't put less than 4 cores/4 threads in anything these days. Gotta let some background stuff run...) In addition to the cpu core speed, the data throughput is important. I don't know what that means wrt CM. I do know that "better" rigs run CM faster. I don't know if bus width, memory controller, RAM speed, RAM size, swap file, cache size, pipelines, or anything else matters more than any other.

    All else being equal (big weasel words there), an AMD 12 core/24 thread "Threadripper" (gotta love their marketing names!) at 3.5GHz will not run the game any faster than an i3 2 core/2 thread cpu at the same 3.5GHz. AIUI. Maybe some day...

    A balanced system with emphasis on data manipulation rather than video creation will provide a better CM experience. IMO.

    Auuuch, I was only checking the Gsync monitors, thinking Freesync would fall in the same price range. Ugh. Well, good to know, thanx! 

    The suggested CPU is a Pentium IV at 2.8 GHz; my AMD FX-6300 runs at 3.5 GHz! So it must be something other than that.  
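c3k's point above - that more cores won't help a mostly single-threaded engine, while core speed helps everything - is essentially Amdahl's law. A small sketch; the parallel fractions below are illustrative guesses, not measurements of the CM engine:

```python
# Amdahl's law: overall speedup from n cores when only a fraction p of
# the work can run in parallel. The p value used here (5%) is an
# illustrative assumption, not a measured number for Combat Mission.

def amdahl_speedup(p: float, n: int) -> float:
    """Speedup with n cores if fraction p of the work is parallelizable."""
    return 1.0 / ((1.0 - p) + p / n)

# Mostly single-threaded game loop: suppose only 5% parallelizes.
for cores in (2, 6, 12):
    print(cores, "cores ->", round(amdahl_speedup(0.05, cores), 3), "x")
# More cores barely help: the speedup is capped at 1/(1-p), about 1.053x.
# By contrast, a faster core speeds up the serial 95% directly, so a 20%
# higher effective single-core speed is roughly 20% faster overall.
```

This matches the observation in the thread that a 12-core Threadripper at 3.5 GHz would not outrun a dual-core chip at the same clock in this game.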

  3. 28 minutes ago, SLIM said:

    That's because Op Linnet II isn't a scenario, it's a torture chamber, designed by an unrepentant sadomasochist. ;)

    Ha ha, OK, I will steer clear of it then. :D

     

    25 minutes ago, IanL said:

    LOL, good thing you don't know where we live then. :)

    Hmmm, I'll just ask Prime Minister Justin about your whereabouts - he seems like a nice enough guy to help someone like me with my mission. ;)

     

    10 minutes ago, c3k said:

    Your buddy who stated that the 2560 vs 1920 screen shouldn't be that much of a strain...must be talking about other games. Those games may be cpu bound, not gpu bound. In that case, the 33% greater load your widescreen places on the gpu is not the limiting factor. However, be sure that 33% greater pixels on a screen is, and will be, a 33% greater load on the gpu. Every time. It may not cause a 33% change in fps...but that's because of other factors limiting the framerate.

    OK, got it. 
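The 33% figure in c3k's reply falls straight out of the pixel counts; a quick check:

```python
# Pixel-count arithmetic behind the "33% greater load" figure:
# 2560x1080 ultrawide vs. a standard 1920x1080 panel.
ultrawide = 2560 * 1080   # 2,764,800 pixels
standard  = 1920 * 1080   # 2,073,600 pixels

extra = ultrawide / standard - 1.0
print(f"{extra:.1%}")  # → 33.3%
```

As c3k notes, that is 33% more work for the GPU per frame, though it only translates into 33% fewer fps when the GPU is actually the limiting factor.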

    10 minutes ago, c3k said:

    For CM series, as IanL stated and you've discovered, there's not a lot of link between gpu and framerate. This game is MUCH more dependent on data throughput than other games. On one of my rigs I've got CM on a hard drive. A WD Black. This particular rig has 64GB of RAM. That doesn't matter. The load times are ridiculous. I cannot imagine the amount of data being pulled off the drive and manipulated. Putting CM on an SSD makes a HUGE load time difference. That gives an inkling of the data being pushed around.

    Yes, but you see, my CPU is much better than the one suggested in the system requirements... That's the whole gist of it.

    When I got an SSD, loading times for the game were much shorter, yes. 

    13 minutes ago, c3k said:

    For smoothness, I've found that using Freesync/Gsync gives great results with CM. Those low frame rates don't seem so slow when it's all linked.

    Well, I bought my ultra-wide this year, and the ones with Freesync/Gsync were way too expensive. It would be really cool to own one though - I'm interested in seeing how it works. Knowing what I know now, I would probably still not buy a 1920x1080 IPS with Freesync/Gsync, because that would cost three times what I splurged on the current one, and I'm loving all the extra monitor space. Plus I didn't know back then, and still don't know now, whether I'll buy an AMD or an Nvidia GPU. Do the newest AMD GPUs still have problems with CM games? 

  4. 30 minutes ago, IanL said:

    The engine we have is what we have. When you are specing out your new machine some day, I would recommend spending a bit more on the motherboard (in other words, get one that has a faster bus rather than the cheapest available) and focus on a better CPU over a better GPU. Obviously there is quite a steep curve at the top end of CPUs and GPUs; all I mean is, if you have x dollars to spend on a CPU and a GPU, spend more than half on the CPU, since extra $ there will give you better bang for your buck. Also, it seems you might be better off getting a higher-GHz i5 than a slower-GHz i7, since CM is mostly single-processor.

    OK, when the time comes I'll ask you guys to suggest what motherboard, CPU, RAM and GPU to get. I'll know who to blame if the performance isn't OK. ;) 

  5. OK guys, here I am. 

    I dusted off the two old monitors I have. Their native resolutions are 1600x1050 and 1280x768, and I tested both scenarios on them.

     

    BP1 The Copse scenario:

    The 1600x1050 monitor gave me 24 frames in that same camera position with all the same ingame settings. 

    The 1280x768 monitor gave me 29 frames in that same camera position with all the same ingame settings. 

     

    BP1 Op Linnet II a scenario:

    The 1600x1050 monitor gave me 7 frames in that same camera position with all the same ingame settings. 

    The 1280x768 monitor gave me 7 frames in that same camera position with all the same ingame settings. 

    I also tried lowering the in-game settings to fastest/fastest on both monitors in this scenario, but to no avail - the framerate remained at 7/8. 

     

    18 hours ago, Battlefront.com said:

    Actually, they do :D  It's a simple thing... the more polygons shown on your screen, the more processing power it takes to put them there.  Seeing more of the battlefield is what you get when you pump up the screen resolution, therefore you put additional strain on your system.  It's as much of a law of physics as is gravity.

    Steve

    Well, I have a friend in the UK who has both 1920x1080 and 2560x1080 monitors, and he told me he experienced little to no extra strain at 2560 over 1920 in the same games. Thinking about it, that was probably because his GPU wasn't maxed out at 1920 - if it had been, his results probably wouldn't have been the same, especially given what you are saying is a law of physics, Steve. But then again, if your game followed the common laws of physics, all more powerful rigs would have to churn out more frames, and that is not the case, because there are all sorts of bottlenecks on either side that you have already mentioned on numerous occasions. See! ;)

    I've read people claiming the extra strain at 2560 vs 1920 to be anywhere from 5% up to 30%, depending on the game tested and the GPU used, so it is not a given that you get 33% more strain, and thus 33% fewer frames, in every game on every GPU just because there are 33% more pixels to push. 

     

    OK, back to the test results. The 1280x768 monitor gave me 40 frames in that same camera position with the 3D model and texture quality settings lowered to fastest/faster. When panning the camera around, it went up to 60 frames in that scenario and mostly stayed at 50. Still, there's no way I would swap my current monitor for that old square abomination, he he. I prefer to play on the current one with the settings at fastest/faster, since I discovered that gives me above 30 frames in most situations (except the biggest scenarios, of course) in exchange for blurrier textures and a very short draw distance (which is less jarring because the blurrier textures make the pop-in less noticeable anyway).

     

    Since I am getting low frames in the biggest scenarios regardless of the graphical options used, the bottleneck might well be the CPU rather than the GPU. That would also explain why George MC, who gave me the GPU, wasn't experiencing the same issues on his rig. I also got 3 frames on the Arnheim scenario at the moment my British paras all started shooting at the Germans advancing across the bridge, while before the shooting started my frames were much higher. Maybe the game and the AMD FX-6300 just don't go well together. Or certain combos of AMD CPUs and Nvidia GPUs clash with the way the game calculates things. I don't know, but I would like to. Could some other piece of my hardware be at fault, like the RAM not running at the right frequency or something? The system reports 8 gigs of RAM installed, so it's not that I've put the sticks in the wrong slots or that only one is working.
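The reasoning used in this post - fps barely moving across resolutions and detail settings points at the CPU, not the GPU - can be written as a tiny heuristic. The sample numbers are the ones reported in this thread; the 15% tolerance is an arbitrary choice for illustration:

```python
# Rough bottleneck heuristic based on this thread's own test method:
# if fps barely changes when you vary resolution/detail, the GPU is not
# the limiting factor - the CPU (or something else) is.

def likely_bottleneck(fps_by_setting, tolerance=0.15):
    """Classify based on the fps spread across graphics settings."""
    lo, hi = min(fps_by_setting.values()), max(fps_by_setting.values())
    if (hi - lo) / hi <= tolerance:
        return "CPU-bound"   # graphics load changed, fps did not
    return "GPU-bound"       # fps tracks the graphics load

# Op Linnet II: ~7-8 fps on every monitor and setting tried.
print(likely_bottleneck({"2560x1080": 7, "1600x1050": 7, "1280x768": 8}))
# → CPU-bound

# The Copse scales with resolution (22 -> 24 -> 29), pointing at the GPU.
print(likely_bottleneck({"2560x1080": 22, "1600x1050": 24, "1280x768": 29}))
# → GPU-bound
```

This is only a sketch of the inference, not a diagnostic tool; real bottlenecks can shift within a single scenario as unit counts and LOS calculations change.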

    Now is not the time to buy a new GPU, with prices artificially inflated by mining, so I'm awaiting the new generation arriving in February or so (then the miners can go f*** themselves).

    I re-read all the forum topics regarding the performance issues people are experiencing and discovered a dissonance between what the developers deem acceptable game performance and what certain users see as such (not those over 70 who can't spot a difference between 15 and 60 frames on their screens, lol). Knowing that 20 to 30 frames is what you must expect from the game puts one more at ease. When I get older I might also stop noticing the difference in screen fluidity, and by then I guess I'll be fully satisfied with the way CM games perform. ;)      

     

     

  6. Heya guys, thank you all for the help!

    I've already stated that I had lower-resolution monitors in the past on the same two rigs on which I played CM. I was experiencing the same issues with those two smaller monitors, but for the heck of it I will plug in a smaller native-resolution monitor and do the testing.

    Note that in practice ultrawide monitors don't need 33% more power than classic 1080p monitors (normal math doesn't apply here), but I digress - let's wait for the test results to come in.

  7. Quote

     

    System Requirements MINIMUM:

    • Operating System: Win7/Win8
    • Processor: Pentium IV 1.8 GHz or equivalent speed AMD processor
    • Video Card: 256 Megabyte dedicated VRAM or better and must support 1024x768 or higher resolution in OpenGL (Intel integrated video may produce blurry text)
    • Sound Card: DirectX 10 compatible Sound Card
    • System Memory 2 GB RAM
    • Hard Drive Space: 3.5 GB
    • Other requirements: DVD drive (for hardcopy version only)
    • The game does not work in a virtualized environment (virtual machine)

    System Requirements SUGGESTED:

    • Operating System: Windows 10
    • Processor: Pentium IV 2.8 GHz or equivalent speed AMD processor or better
    • Sound Card: DirectX 12 compatible Sound Card
    • Video Card: 1 GB dedicated VRAM or better and must support 1024x768 or higher resolution in OpenGL (Intel integrated video may produce blurry text)
    • System Memory 4 Gigabyte or more RAM
    • Hard Drive Space: 5 GB
    • Other requirements: DVD drive (for hardcopy version only)
    • The game does not work in a virtualized environment (virtual machine)

     

     

    These are the minimum and suggested system requirements stated on the Battlefront store page for CMBS (the other games are the same; only the hard drive space naturally differs) - 4 gigs of RAM is indeed the suggested amount. 

    I have a much better AMD CPU than the one suggested and 100% more VRAM, yet I'm getting these issues and can only play up to large missions with 3D models and textures set to lowest. That does not square with the suggested requirements. Does meeting the suggested requirements mean you can max out all the in-game settings and play any stock scenario at 20+ frames? Or does it mean something else entirely? If they were accurate, I shouldn't be getting such low frames (that is, if my rig does not indeed have an issue). Note that most of the games I own pose no particular problems the way CM games do, with a few exceptions (where the developers admitted the problem was on their side and not mine - AGEOD games and Ultimate General: Civil War, for example). I know - probably none of the games I own are built on an OpenGL chassis.  

    Anyway, if there are any other insights to this I am all ears, I'm resorting to playing the game on the lowest settings until I buy a new rig and see if I'll have more luck for the 3rd time. :D 

  8. 3 minutes ago, IanL said:

    That is not what that means.

    As you know, there is much more to a smooth experience than memory or graphics card or CPU - it is the combination. If you were to spring for a new Mac, I am certain you would experience much better performance, but you could do that with new PC hardware too. The nice thing about Macs is you are sure not to get a system with a weak link. But you pay a premium for the system.

    OK, thank you for the explanation.

    Ha, I was never into Macs, plus those things cost a fortune over here. I have 80 Steam games and 14 non-Steam ones (Uplay, GOG, etc.), mostly PC-only with no Mac versions. There's no way I would switch just to make sure the CM games play well. :)

    Thank you very much IanL for taking the time for the test! When you updated your Nvidia drivers you probably left the "clean install" option checked, and that is what wipes all the custom-made game profiles. Don't forget to uncheck it next time. :) 

    The difference in your testing is that you have antialias/multisample set to off (I had it on), vertical sync set to off (I had it on), and you use shadows (if I turn them on, the framerate plummets a lot, so I don't use them - plus they look way more jerky on this newer GPU than they did with my previous one).

    Let me redo the test with your settings (balanced/balanced, vertical sync and antialias/multisample off, shadows on): I get 18 frames. When I turn off the shadows I am back at 22 frames, which means vertical sync on and antialias/multisample on give me precisely the same framerate - those two do not affect the framerate in this particular scenario. This holds true even when I lower the 3D model and texture quality: sync on and antialias/multisample on do not dip my frames compared to off, at least in this scenario. I have yet to test this with bigger scenarios. 

    I can play The Copse scenario at 37+ frames when I set the 3D model quality to fastest and the 3D texture quality to fast. 

     

    Can anyone confirm getting more than 9 frames in the BP1 - Op LINNET II (a - USabn UKgnd).btt scenario when facing all the units? 

     

     

  10. 3 minutes ago, rocketman said:

    @Hister If you haven't already, try the Nvidia program "Nvidia Inspector" to fine tune graphic card settings. I had progress using it, especially capping FPS to 30-35, which smoothes out the variation a bit. I'll try to dig up the settings I have in case you want to try it.

    Sure why not! Thanx. 

    Still, something is way off, and I doubt Nvidia Inspector will make it go away. I fired up the biggest-map scenario in CMBN and got 9 frames while the visuals themselves were immensely degraded. Take a look for yourself. Plus it was very hard to move the camera around...

    CM Normandy 2017-10-21 23-59-54-48.jpg

  11. My more detailed specs:
     
    Power Supply: 550W XFX P1-550S-XXB9  
    Motherboard: Asus AM3+ M5A97 (970 ATX) [90-MIBFSO-G0AAY007]  
    CPU: AMD FX-6300 BOX 3.5 GHz - 14MB Cache - 95W with stock cooler 
    RAM: DDR3 1600 8GB CL8 Corsair 2x4GB Vengeance CMZ8GX3M2X1600C8B 
    Hard Disk(SSD): Samsung 840 Pro 128GB (MZ-7PD128BW)
    11 minutes ago, kklownboy said:

    to derail this thread a bit more...  Hister and IanL you forgot the most important part,

    your Motherboards.... Brand, Type and firmware version.

    I don't know where my original mobo box is - is the firmware version written on it? Or can I also find it somewhere on the motherboard itself? 

     

    2 hours ago, rocketman said:

    I think there is a setting in FRAPS to turn on/off FPS display and where on the screen it is shown. Try that.

    Correct, thanx! I did turn it on, but the screenshots now say I had 0 frames while I actually had 22. Anyway, I trust you guys don't think I'm making this all up: I get 22 frames in that position. The thing that baffles me is this: when I start that scenario and just lower my camera from the original position to where the screenshot was taken, I always get a higher framerate (around 30 or more); but when I pan the camera around my starting units, keeping them in sight, and return to the original position, I get much lower frames - 22 in this case. If I then press the Windows key to get to my desktop and click back into the game, my frames go back up to where they originally were when I started. I make another circle around my units and the frames drop again to 22. Looks like a memory leak of some kind to me, doesn't it? It is the same on every map I play: as soon as I start looking around, the frames drop from the starting ones.       

    My nvidia control panel settings:

    Everything that can be application controlled is application controlled; FXAA is OFF, gamma correction is ON, antialiasing transparency is OFF, power management mode is max performance, shader cache is ON, anisotropic sample optimization is OFF, negative LOD bias is CLAMP, texture filtering quality is set to QUALITY, trilinear optimization is OFF, triple buffering is ON (since I use in-game vertical sync).

     

    26 minutes ago, IanL said:

    No, no, not at all. What I thought was that you had the wrong idea about the prevalence of what you were experiencing. I get it. Some may recall I had a lot of trouble with movement orders near bridges. I reported the problem several times. Finally I posted an "are you guys ever going to fix this" post, and the response was "fix what?". It turns out it was very nearly only me that had the problem. I worked with Ken and Steve and we figured out the issue and the fix.

    Ha, crazy, I remember this episode of yours. You must have felt very special... ;) 

    34 minutes ago, IanL said:

    Yeah but there are two classes of issues (my categories) those that think the game is broken if they cannot get 60 fps. Clearly they are both out to lunch and never going to be satisfied. Those that get sub 10 fps and stuttering camera controls. Typically they find game and card settings that resolve the issue. I am not aware of people who cannot get a reasonable frame rate to play. Hence I was surprised by your post. There simply are not a ton of people who cannot play the game due to frame rates. Again not counting people with unreasonable expectations.

    Note I am not saying that performance is perfect by any means I am just saying that people have mostly been able to get to good enough.

    I know, some people have the 60-frames idea embedded in their heads, and anything less on their powerful rigs is an insult. I am not one of them. I would be happy to get 30+ frames on balanced/balanced in the vanilla stock scenarios/campaigns/maps, no matter where my camera is looking. 

    35 minutes ago, IanL said:

    Sadly for me my internet is down which means I am spending my day with support and replacing hardware. So, posting on my phone and not able to post screen shots.

    What you are starting is a good strategy. 

    I have an Intel i5 4670 3.4Ghz with a GTX 760 card. I'll have to report back with the frame rate I see tomorrow ish.

    I am not sure if my cpu is more or less powerful but my graphics card is. I am not sure if there are any testers with similar cards. I'll have to ask.

    Thank you very much Ian for the engagement - if you do find someone, let me know, so that we can compare frames. Your CPU is also more powerful; mine would be in the range of Intel's i3s. 

    Thank you IanL, and especially Steve, for taking the time to answer me directly - much appreciated! 

    Ha ha, I must have come across as a little crybaby - not my original intention. Although I admit I am annoyed by the situation I am experiencing, and uneasy after Steve's reply. 

    I am hardly alone in experiencing this, as the forum testifies (there have been a bunch of performance-issue posts over the years by different people sporting different hardware), but what percentage of players gets hit with very poor performance is beyond me - and, judging by your answer Steve, probably beyond you too (tech help requests might give better insight into this than us outsiders can ever have).

    Another layer of goo here is that 15 FPS is just fine for some people, while for others like me it is a no-go. I can clearly see the difference between 10, 20, 30, 40, 50 and 60 frames with my own eyes. It must have something to do with the way eyes receive and brains process the image - we ain't all made equal. I get a very unpleasant physical feeling when frames drop below about 23-25, and the lower the frames, the worse it gets. I don't feel good after watching such low framerates for even a short time. I haven't checked whether there is any medical explanation for this. I get irritated and drop the scenario, or try to avoid camera angles that drop the frames too low. That said, I don't play this game top-down like many do - I like to be "personal" with them troops, keeping the camera on the ground and trying to see what my soldiers see. :)

    I also know the game isn't made to be played at 60 frames (I get those when I pan the camera towards the map edge with only a few units and trees in view - oh, the visual smoothness!), but as soon as I pan the camera around, the frames drop dramatically even in tiny scenarios. Having 20 to 30 is considered normal for these games, and I can play without any issues when frames are above 25-30. I have been around CM for many years now, monitoring the forum and often engaging in different debates.
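Converting the frame rates discussed here into per-frame times hints at why drops below roughly 25 fps feel so abrupt:

```python
# Per-frame time budget at the frame rates mentioned in this thread.
def frame_time_ms(fps):
    """Milliseconds each frame stays on screen at a given frame rate."""
    return 1000.0 / fps

for fps in (60, 30, 22, 7):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")
# At 60 fps each frame lasts ~16.7 ms; at 7 fps a frame lingers ~142.9 ms,
# long enough for the eye to register individual frames as stutter.
```

The jump from 30 fps (33 ms) to 22 fps (45 ms) adds over 12 ms per frame, which is why the difference is noticeable even though the fps numbers look close.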

    So I have had two different hardware combos on which I have played the CM games (both with AMD CPUs and Nvidia GPUs - I never had an Intel CPU and AMD GPU combo). Both rigs have given me the same low-frame issues. At this point I still expect the game to run fine when I buy myself a new rig, but I am trying not to set myself up for disappointment if it doesn't, since it seems to be more of a hit-and-miss thing than a "this rig combo or this CPU or this GPU will give you problems" thing.  

    The current GPU I have was kindly given to me by Baneman (who himself didn't experience the low framerates with it). It is a GTX 660Ti. The previous one I had was a GTX 550Ti (in-game shadows were drawn much more nicely with it than with the current one, yet the upgrade brought me no increase in frames). 

    My CPU is an AMD FX-6300. I have 8 gigs of RAM. Games are installed on an SSD and on a hard disk. I run Windows 7 with the latest Nvidia drivers; all device drivers are up to date.  

    I have Juju's mod installed, but I experience the same issues without it too. 

    In-game settings are set to balanced/balanced, no shaders and no shadows, tree detail low, vertical sync on, high-priority process off (no change if I set it to on). I play the game at its native resolution, 2560x1080. I have played it at lower resolutions on my old monitors with the exact same settings and got only 2 or 3 more frames in the same scenarios - not a drastic change.

     

    I have CMBN, CMRT and CMBS, all modules and packs (minus the newest one for CMBS that hasn't been officially announced yet).   

    I started up The Copse, a tiny-map scenario with 3 tanks, 2 halftracks and one Kangaroo - all in all, 6 vehicles and 1 platoon of troops sitting inside them.  

     

    I position my camera at ground level just behind my allied units, looking towards the enemy positions. I get 22 frames per second. See the attached screenshot. I don't know why the frames are not shown on the screenshot; usually they were. I am using Fraps to measure frames.  

     

      

     

     

    CM Normandy The Copse scenario 22 frames in this camera position.jpg

    I, for one, would above everything else like to see the patch that fixes the odd performance issues some of us have with the CM games. It should be high on the priority list, me thinks. It can't be right that some older hardware combos produce much smoother gameplay (higher frames per second and no stuttering) at the same screen size than newer, more powerful components, for example.

    It should be acknowledged by you devs, and an attempt made to fix it. I don't know what percentage of players is affected, but in my case I have mostly stopped playing your games due to very low framerates in most scenarios (below 20 frames, sometimes as low as 3), even the tiny ones. 

    Hoping to see something done on this front eventually. :D

        

    Armorgunner, having the latest hardware is no guarantee of FPS success. Many players who have gotten new comps have complained about their CM games running at a snail's pace. Just a warning, so that you don't feel too disappointed if the performance isn't up to what you expect from the new comp.

    Older gaming rigs often reportedly play CM better. But it's not consistent - experience varies a lot, so it's really a hit-or-miss affair.  
