Pak40 Posted September 3, 1999
Upon reading an article about the new graphics chip from nVidia, I got to thinking about the future of CM and its graphics. It seems that this new chip will move almost all of the graphics processing to the video card, thereby freeing up the CPU and the conventional RAM. Here's one of the quotes from the article found at www.cdmag.com: "Transforming and Lighting polygons are the single most CPU-intensive parts of current 3D games. Moving these calculations from the host CPU to the graphics card frees up an incredible amount of CPU time for more advanced AI, physics, and game logic." Steve, this sounds good for the future of Combat Mission. Do you currently have any plans to incorporate this technology?
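To make the quote concrete, here is a minimal C sketch of the per-vertex "transform and lighting" work the article is talking about. All names are illustrative, not from CM or the article; the point is only that this loop runs for every vertex of every frame on the CPU, and the GeForce 256's pitch was to move exactly this math onto the card.

```c
#include <math.h>
#include <assert.h>

/* A vertex position or normal in 3D space. */
typedef struct { float x, y, z; } Vec3;

/* Transform: multiply a vertex by a row-major 4x4 matrix (w assumed 1).
   This is the "T" in hardware T&L. */
Vec3 transform(const float m[16], Vec3 v) {
    Vec3 r;
    r.x = m[0]*v.x + m[1]*v.y + m[2]*v.z  + m[3];
    r.y = m[4]*v.x + m[5]*v.y + m[6]*v.z  + m[7];
    r.z = m[8]*v.x + m[9]*v.y + m[10]*v.z + m[11];
    return r;
}

/* Lighting: simple Lambert term for one directional light.
   Returns brightness in [0,1] given unit-length vectors. */
float light(Vec3 normal, Vec3 light_dir) {
    float d = normal.x*light_dir.x + normal.y*light_dir.y
            + normal.z*light_dir.z;
    return d > 0.0f ? d : 0.0f;
}
```

Multiply those few dozen floating-point operations by tens of thousands of vertices, sixty times a second, and it is clear why offloading them frees up so much CPU time.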
Guest PatB_TGN Posted September 3, 1999
Hey Pak40, game companies generally do not use 'upcoming' technology for title development. They use whatever is available and popular at the moment. How many games are fine-tuned for MMX? *grin* Or, for that matter, the vaunted 3DNow! architecture? There's more of a market for Glide and OpenGL than anything else (at the moment). I'd rather see the game ported to Linux so I can have an excuse to get that dual Celeron system I've been dreaming about. -Patrick
CoolColJ Posted September 3, 1999
Pak40, the CPU is still the limiting factor until CPUs and OSes go at least 64-bit (Windows is still 32-bit, and not all of it!), and preferably 128-bit like the PlayStation 2! The G4 Mac is 128-bit; when it comes out, hopefully BTS will upgrade to one. CCJ
Pak40 Posted September 3, 1999 (Author)
Pat - I was actually referring to follow-up Combat Mission games, not the one currently in development. I know that it's impossible at this point to convert the current CM to comply with the GeForce standards. CoolColJ - True, the CPU is still a limiting factor. But if you put ALL graphics processing, and the memory used for graphics, onto the graphics board, then you free up the CPU to handle things like AI and calculations. The entire concept behind the GeForce 256 is that it is actually another CPU. They actually call it a GPU (graphics processing unit). Read the article and you will understand it better.
CoolColJ Posted September 3, 1999
Pak40, CM does all the AI and calculations separately from the movie playback, so it wouldn't make a difference. What it will allow is better and higher-quality playback: possibly more polygons and better frame rates. But I suspect only with OS support will the graphics be totally independent from the other CPU functions; even now, Windows still doesn't do real pre-emptive multitasking like my old Amiga, which had a separate blitter chip for screen movement and the like. Though there is much better hope of this with a Mac. Damn, I wish Commodore had never gone broke. CCJ
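The split CoolColJ describes can be sketched in a few lines of C. This is a toy model with hypothetical names, not CM's actual code: the turn is resolved once, up front, into a recorded list of events, and the playback phase only reads that list. A faster graphics card therefore speeds up playback, not the AI/physics resolution step.

```c
#include <assert.h>

#define MAX_EVENTS 64

/* One recorded moment of the turn's "movie". */
typedef struct { float time; int unit_id; float x, y; } MoveEvent;

typedef struct {
    MoveEvent events[MAX_EVENTS];
    int count;
} TurnRecord;

/* Resolution phase: all AI/physics decisions happen here, once per
   turn, with no rendering involved. (Toy: one unit taking 3 steps.) */
void resolve_turn(TurnRecord *rec) {
    rec->count = 0;
    for (int t = 0; t < 3; t++) {
        MoveEvent e = { (float)t, 1, (float)t * 10.0f, 0.0f };
        rec->events[rec->count++] = e;
    }
}

/* Playback phase: no decisions at all, just reads the record.
   Returns how many events have occurred by the given movie time. */
int events_up_to(const TurnRecord *rec, float time) {
    int n = 0;
    for (int i = 0; i < rec->count; i++)
        if (rec->events[i].time <= time) n++;
    return n;
}
```

Because `events_up_to` does no simulation work, the playback loop can be re-run, rewound, or rendered at any frame rate the video card allows without touching the AI results.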
Guest Ssnake Posted September 3, 1999
That's all your fault! You'd better have bought more of those machines!!! <vbg>
Guest Big Time Software Posted September 3, 1999
Ssnake, spoken like a true German. Pak, we welcome ANY new technology that can give us better frame rates and more polygons at the same time. Each forthcoming "standard" claims to be the best, and well, that really is a tall boast to make since it has yet to be seen. But we won't support ANYTHING until it becomes a significant standard. As Patrick said, we could have wasted a LOT of energy developing for technologies that never made it (hell, Sierra even had their own 3D card for a while...). One thing IS for sure, though. Whether it is nVidia or some other "standard", hardware will get better and faster, and one or two will be more standard than others. We just have to wait for the horses to get into the final stretch to make sure we don't bet on the wrong one (MMX, anybody?). Steve
Doug Beman Posted September 4, 1999
HEY! I bought one of those MMX systems. Regretted it to no end; finally made right with myself when I bought the Celery/mobo/RAM combo and asked my brother-in-law to whip it all together. Sad thing is, pretty soon I'm gonna need MORE RAM (64 now) to keep up with games. Nature of the beast. DjB
krm Posted September 6, 1999
What's the difference between an MMX system and a "normal" one? I see many advertised now, e.g. P3 450 MMX etc. Also, what role do motherboards play in the graphics debate (if any)?
Doug Beman Posted September 6, 1999
Here's what I think it is. MMX was a specialized instruction set which Intel built into its chips starting around late '97. MMX was geared specially toward handling multimedia calculations: things like packed-integer math on pixels and samples. Intel was able to determine what sort of calculations are needed for most graphics in 3D games and tailor-build some pieces of the chip (the MMX part) to be better at those calculations. MMX had no effect on things like spreadsheets or programs like Adobe Photoshop (for those, getting a PII is the thing to do). The drawback of MMX is that games had to be tailor-programmed to recognize and utilize the MMX part of the chip, much in the same way that games in the early days of 3D-accelerator cards had to be tailor-made for one card type or another. Not many game companies took the extra time and money to tune games for MMX, since the performance increase was less than you'd get with a good 3D card, and MMX was nowhere NEAR an industry standard. As for mobos, those play more of a role in programs that do a lot of pushing of stuff from hard drive to RAM to CPU and out again. The effect of a good mobo is to speed the transfer of all data that have to cross the mobo for some reason. Of course, if you get a mobo with AGP and an AGP graphics card, you'll see a noticeable increase in graphics performance. DjB
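The core idea behind MMX can be illustrated in plain C. Real MMX code uses the compiler's `<mmintrin.h>` intrinsics on special 64-bit registers; the sketch below is only a portable illustration (the function name is made up) of the concept Doug describes: packing four 16-bit values into one 64-bit word so that one "operation" adds all four at once instead of one value per instruction.

```c
#include <stdint.h>
#include <assert.h>

/* Illustrative packed add of four 16-bit lanes held in a uint64_t.
   An actual MMX PADDW instruction does this in a single cycle on
   an MMX register; here each lane is unpacked, added with natural
   16-bit wraparound, and repacked. */
uint64_t packed_add16(uint64_t a, uint64_t b) {
    uint64_t result = 0;
    for (int lane = 0; lane < 4; lane++) {
        uint16_t av = (uint16_t)(a >> (lane * 16));
        uint16_t bv = (uint16_t)(b >> (lane * 16));
        result |= (uint64_t)(uint16_t)(av + bv) << (lane * 16);
    }
    return result;
}
```

This is also why games had to be "tailor-programmed" for MMX: the compiler wouldn't pack data into lanes for you, so the programmer had to restructure the inner loops by hand to use the packed instructions.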
Guest David Harrison Posted September 6, 1999
Is the GeForce 256 the same as the "NV10" that I hear will be released this fall? David
kingtiger Posted September 6, 1999
David, they are one and the same. Also, expect an altogether new chip every six months after it is out. The GeForce 256 is quoted as being four to ten times faster than a Pentium III 550 on simple to complex lighting scenarios. Wait till two weeks after Christmas and buy whatever you need; now is not the time to buy a new card. ARTICLE on the GeForce 256: http://www.zdnet.com/pcmag/stories/trends/0,7607,2326894,00.html Steve stated, "Each forthcoming "standard" claims to be the best, and well, that really is a tall boast to make since it has yet to be seen." Well, I upgraded from a TNT Viper 550 to a TNT2 Viper 770 Ultra, and although it is better and smoother and faster, it clearly was not a wise use of a couple hundred bucks. I could have bought three more copies of CM and gotten the other Panther print, the King Tiger and the Tiger print. Also, not much difference between an Ultra TNT2 and a 3DFX card in my experience. Either TNT2 or 3DFX will excel in games optimized for them. Steve makes a great point. Even nVidia is coming out with a new program to release newer, faster chips every six months. The technology is exciting and will surely take 3D video gaming to a new level, but Bitboys has a product in BETA for developers that threatens to displace both 3DFX and nVidia. It is 512-bit, and nVidia is 256-bit. AND it was announced August 2, 1999. Here is an article: http://www.bitboys.com/a_pr_glaze3d.html