Doc, Fillrate question?

Started by vanduzer

That is a really good point. I'd like to get a better framerate in the game, but in air combat I am getting 40-60 FPS (on minimum settings).

For me the burning question is "what do I need to get good enough performance to record video with PlayClaw or Fraps?" I've been wanting to do so for years (in particular for instructional game vids), yet with this computer I've fallen flat:

  • AMD 925 Phenom II x4 quad-core processor (2.8GHz)

  • 4GB DDR3 1333 RAM

  • nVidia 9800 GT (512MB) graphics card

  • M4A785TD-M EVO motherboard

  • 300GB PATA HDD for system and applications plus 1TB RAID0 (2x500GB drives) on SATA II (3Gb/s) interface

I get just 9 to 11 FPS recording video. Where is the weak link - the vid card? Would I be better off just buying a cheap GDDR5 card, or do I need to wait and buy an HD 6950? Whatever card I get needs to last me a couple years at least...

What resolution?

Not sure why Playclaw would be killing it that bad considering your system specs. Either there is a driver issue or there's a hardware bottleneck at the vidcard.

I've got a budget system:

Intel i5 750

2GB DDR3-1333

Biostar T5xe mobo

ATI 4860

single 300GB SATA drive

At 1680x1050 with most settings on high I can still run PlayClaw and hold over 30 FPS around contested towns.

I am on 1680x1050 resolution, 32-bit color. Everything else is turned off or turned down to minimum settings (except tracer smoke and muzzle flash).

By the way, the 9 to 11 FPS framerate was over open sea at 3km and only 3 planes in sight. That is freaking unplayable.

...

Wanna hear sumpin' funny? The 9800GT texture fill rate is about 34 GT/s and the 460 is around 37 GT/s - nearly identical! And the old 9800GTX+ pumps out 47 GT/s - more than the best of the 400-series! Look here: http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_processing_units

One fella told me that GDDR5 RAM makes a big difference over the older GDDR3 on board the 9800GT. But the performance numbers are listed as nearly the same between my card and the 460. So which do I believe?
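For what it's worth, the Wiki figures follow straight from texture fillrate = texture units × core clock. A quick sketch in Python (the TMU counts and clocks below are approximate, recalled from the same comparison chart, so double-check them there):

```python
# Texture fillrate (gigatexels/s) = texture units (TMUs) x core clock (MHz) / 1000.
# Card specs are approximate figures from the Wikipedia comparison chart.
cards = {
    "9800 GT":   {"tmus": 56, "core_mhz": 600},
    "9800 GTX+": {"tmus": 64, "core_mhz": 738},
    "GTX 460":   {"tmus": 56, "core_mhz": 675},
}

for name, spec in cards.items():
    gt_per_s = spec["tmus"] * spec["core_mhz"] / 1000.0
    print(f"{name}: {gt_per_s:.1f} GT/s")
# 9800 GT:   33.6 GT/s (the ~34 quoted above)
# 9800 GTX+: 47.2 GT/s (~47)
# GTX 460:   37.8 GT/s (~37)
```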

Unfortunately, the series number, especially the first digit in the 4-number series designations, is less and less meaningful.

I recommend always checking Tom's Hardware.

Look for the most recent "Best Graphics Cards For the Money" article (they do one each month).

http://www.tomshardware.com/reviews/best-gaming-graphics-card-geforce-gtx-590-radeon-hd-6990,2879.html

Check the last page of those articles, which is the Graphics Card Hierarchy Chart:

http://www.tomshardware.com/reviews/best-gaming-graphics-card-geforce-gtx-590-radeon-hd-6990,2879-7.html

BLOO - note that I wasn't making much over the model numbers, but instead just named them to contrast the texture fillrates that I looked up for all of them on the Wiki comparison charts (link provided in the original post quoted above).

If "[texture] fillrate is king" for WWIIOL, then Tom's Hardware comparisons are too general. Zeroing in on the texture fillrate numbers should provide the best comparison, right? So does the type of graphics RAM - GDDR3 versus GDDR5 - make any additional difference? And then there is the amount of RAM - typically a comparison between 512MB, 768MB, and 1GB. How much difference does this make? Would it be worth grabbing a card (that is used only for this game) that has 1.5, 2GB, or more GRAM so that the card performs better now? And how about the future - with an eye for coming changes to the game? When I pony up for a graphics card, it had better last me at least 3 years.

I'd love to make some instructional (particularly bombing) videos for the game, but I cannot afford the gear for that. This current machine is the closest I've ever come to it, I think - how far off am I now? The last time I made an effort was 6 years ago, when I spent $1,650 getting a machine that would be capable, but it wasn't. In fact it was only competitive for air combat (without recording video) for about 6 months - after that I fell farther and farther behind with every new patch. If it weren't for wealthier players tossing me their hand-me-down gear, I'd have been forced to quit a couple of times in the past.

How much do I need to budget per year to keep playing this game at, say, the 80th percentile level of the air war?

...

Edited by oyaji


I don't have any solid info on RAM speeds and the like for you.

One factor with our game is that the CPU is often just as much of a performance choke point as the GPU.

The model numbers thing is a peeve of mine - people buy a card with a high first digit thinking that guarantees it's better than every lower-numbered unit.

(quoting oyaji's system specs and video-recording question from above)

http://www.nvidia.com/object/product_geforce_9800gt_us.html

Well, a 9800GT should be good enough to run this game on, with 112 SPUs (cores) and 512MB of GDDR3 on a 256-bit bus (bus width plays an important role in memory bandwidth).

But you're saying your Fraps recording is where the issue is.

The SATA interface isn't really what you should be paying attention to, or the RPM rate (kudos on knowing the term PATA). What you want out of a Fraps recording drive is a big cache. The typical low-end cache on a standard platter HDD is 16MB, mid-range is 32MB, and the "nice" drives have 64MB, or 64+64MB on the uber drives.

That cache makes a hell of a difference: with my 16MB cache PATA drive I'll get stu-stu-stutters, while my 64MB drive records smoothly.

Also, another NEW feature in Fraps is video caching. Hold your video record button down and the FPS indicator will turn from yellow to hot pink (not red), caching the video instead of raw recording it. When something cool happens, double-tap your record key and it'll save the previous 30 seconds of video; single-tap your key to save the previous 30 seconds and continue recording what you're doing now; tap again to stop.

It's a nifty feature.

My Fraps recording, as far as WWIIOL goes, worked exactly the same between my 8800GTS SSC (also 112 cores) and my ATI Radeon HD 5770.

Here's a Fraps video I recorded recently:

-0E2KFoU1C0

Another video of the DD fight a while back:

2DNdFBVYzL8

You can see they both run fine.

AMD Phenom II X4 955 @ 3.7Ghz/4096MB DDR3-1600/ATi Radeon HD 5770

I'd check out the specs for your hard drive(s) and see what the cache is, then post back here.

[edit] There are also things you can do, like setting your processor affinity so Fraps won't use the core(s) that ww2ol.exe is using, and setting Fraps to record to a drive that WWIIOL isn't using at the moment.
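If you'd rather script that than click through Task Manager every session, something along these lines should work (a rough sketch using the third-party psutil module; the process names and the core split are just assumptions for illustration - adjust them to whatever your recorder is actually called):

```python
# Sketch: pin WWII Online to cores 0-1 and the recorder to cores 2-3 so they
# don't fight over the same cores. Requires the third-party psutil package.
# Process names and the core split are assumptions - adjust for your machine.
import psutil

AFFINITY = {
    "ww2ol.exe": [0, 1],   # game gets the first two cores
    "fraps.exe": [2, 3],   # recorder gets the other two
}

for proc in psutil.process_iter(["name"]):
    cores = AFFINITY.get((proc.info["name"] or "").lower())
    if cores:
        proc.cpu_affinity(cores)   # same effect as Task Manager's "Set affinity"
        print(proc.info["name"], "-> cores", cores)
```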

Overclocking that Phenom II might also help (if it's a Black Edition).

And oh yeah, pound for pound I've had fewer issues with my ATI card than I had with my previous nVidia cards. And this is only my 2nd ATI video card, the one before that being a 9600 Pro.

Edited by indo420


By the way, the 9 to 11 FPS framerate was over open sea at 3km and only 3 planes in sight. That is freaking unplayable.

...

I'm assuming you've tried various settings in PlayClaw?

Here's their most recent settings tutorial:

http://www.playclaw.com/article-video-capture-settings.php

I had to play around with the Compression/Frame size/Frame rate/Cores settings until I found a satisfactory balance between the in-game FPS hit and video quality.

I suggest adjusting the settings until you get the smallest FPS hit, then check the video quality; if it isn't satisfactory, work backwards with the settings from there until you find a good balance.

(quoting indo420's reply above about the 9800GT, drive cache, and Fraps recording tips)

I don't have to look it up - those are Western Digital 500GB drives, each with a 32MB cache. As I mentioned, they are in a RAID0 on 3.0Gb/s SATA II, and the only thing they are for is recorded video (meaning they haven't ever been used for anything except 2 trials of video recording). I thought that would be a pretty good setup, but I ended up disappointed.

I used a 2-week free trial version of PlayClaw for my recording attempts. It was about 8-10 months ago that I last tried recording and then gave up in frustration. I don't remember for certain if I tried setting the affinities of WWIIOL and Fraps to 2 different pairs of the 4 cores in my processor, but I am pretty sure I did - assigning affinities has been a habit of mine for about 5 years, so that I could always try to squeeze the best performance out of the game using bottom-end equipment (until last year I was still using an 8x AGP nVidia 6800GT card, and only went to PCIe when AGP became out of date for the game).

The AMD 925 is not Black Edition. The motherboard is fairly basic too, but does allow some sort of rudimentary overclocking. I figured I might have to learn a bit about how to get some extra boost out of that when I have run out of other options and when I have a better heatsink than stock (I was thinking one of those Zalman "hoop-style" jobbies would be a good way to go... they look like they might do a better job mounted horizontally than the typical rectangular heat-pipe layouts.) How much did overclocking your 955 from 3.2GHz stock up to 3.7 do for you?

I am wondering if the vid card memory at 512MB is the problem, or if it could be the Northbridge (it does not have heatpipe cooling, only a rudimentary finned aluminum heatsink). Your 5770 has 1GB of video RAM - does the 8800 also have 1GB or is it 768MB or 512MB?

...

One last thing - when I stepped up from a 6800GT (AGP card) to a Radeon 3870, and then to an nVidia 9800GT, I noticed my air-to-air kills jumped up each time along with my framerate. I've had a sneaking suspicion that automatic-weapons hits go up with increased framerate - and that sorta confirms my suspicion that a crappy framerate just allows the enemy to fly through the bullets that don't render in between frames. It seems to me that K/D and fighter prowess can be bought to a fair degree in this game, which goes against the grain with me, since I started playing this game 10 years ago because of the idea of a level playing field: that standings here are a measure of player ability and not the result of the artificial "levelling" so common in other games.

I think I do fairly well with the equipment I have, but vanity compels me to want to do better. I am neither willing nor able to spend much on that endeavor, though. Does it really make much difference?

Edited by oyaji


I've noticed the framerate-versus-actual-recorded-hits connection also, and not just in this game - it's very apparent when I play BFBC2 too.

I'd suggest trying the latest trial version of PlayClaw.

The CPU cores setting in PlayClaw selects how many cores you want the program to use during compression; I haven't toyed with core affinity in Task Manager as an alternative.

My current Playclaw settings are:

Compression/low

Frame size/Full

Frame Rate/30

Cores/2

Frame size/Half generates low quality videos, but the FPS hit during play is nil.

For some reason Frame Rate/60 causes stutters in the recorded video; 30 results in the smoothest videos for me.

Side note on the compression settings: high compression uses a special codec, M-JPEG, which required some scouring of the net to find.

(quoting the PlayClaw settings post above)

Sounds familiar to me too... I'm pretty sure I was trying to record at 30 FPS and couldn't do any better than 11.

I am also pretty sure I went for either low or no compression. That costs more in storage space for less processor overhead, unless I am mistaken. (Besides not wanting to waste processor cycles in compressing video while recording in real time, I would always prefer to have raw video at full resolution and compress it later if I need to - as a matter of quality).
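To put a rough number on that tradeoff (a back-of-the-envelope sketch using the resolution and frame rate mentioned in this thread; actual capture formats vary, so treat it as ballpark only):

```python
# Rough cost of truly uncompressed capture at the settings discussed here.
# Assumes 3 bytes per pixel (24-bit RGB); real Fraps/PlayClaw formats differ.
width, height = 1680, 1050
fps = 30
bytes_per_pixel = 3

mb_per_sec = width * height * bytes_per_pixel * fps / 1_000_000
print(f"~{mb_per_sec:.0f} MB/s")                        # ~159 MB/s
print(f"~{mb_per_sec * 60 / 1000:.1f} GB per minute")   # ~9.5 GB per minute
```

That is roughly the sequential write ceiling of a two-drive platter RAID0 of that era, which may be part of why "no compression" hurts so much.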

My connection isn't bad these days - it improved after the local telephone company upgraded their equipment after all the hurricanes we've had over the past 5 or so years. I usually ping around 50-60 with 5Mbps download and 1Mbps upload to Dallas according to SpeedTest (although I have seen the ping time double and the transfer numbers drop to half on occasion). Packet loss is usually a non-issue.

I still think that if I could get my combat framerate up higher that I'd then be able to record at faster framerates. Doesn't framerate always drop when you do HDD recording?

Edited by oyaji


Question for CRS:

When looking at video card reviews and the list of games that are generally used as benchmarks, which game should we look at as the closest marker to BGE? Generally speaking, I know it wouldn't be possible to have an absolutely direct comparison.

Edited by mako26

(quoting oyaji's fillrate comparison and GDDR5 question from above)

And this is where you realize fillrate means almost nothing these days. It's all shader count and shader speed. GPUs have been able to fill 60 frames at 1920x1200 for many years now, but that doesn't mean anything if you can only run the shader operations at 20 FPS.

Fill rate = dinosaur performance metric.

GDDR5 does make a fairly significant difference, as it is quad data rate - two reads/writes on both the rising and falling edges of the clock cycle. Hopefully similar memory will find its way to the system bus soon; GPUs have usually been a step or two ahead on all the major RAM breakthroughs.
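A rough sketch of why the memory type matters (the clocks and bus widths below are approximate, typical-of-the-era numbers for a GDDR3 9800GT-class card versus a GDDR5 GTX 460-class card, not exact specs):

```python
# Peak memory bandwidth = memory clock x transfers per clock x bus width in bytes.
# GDDR3 moves data twice per clock; GDDR5 effectively four times per clock.
# The clocks and bus widths here are ballpark figures, not exact card specs.
def bandwidth_gb_s(mem_clock_mhz, transfers_per_clock, bus_width_bits):
    return mem_clock_mhz * 1e6 * transfers_per_clock * (bus_width_bits / 8) / 1e9

print(bandwidth_gb_s(900, 2, 256))   # GDDR3, 9800GT-class:  ~57.6 GB/s
print(bandwidth_gb_s(900, 4, 256))   # GDDR5, GTX 460-class: ~115.2 GB/s
```

Same clock and bus width, roughly double the bandwidth - which is the difference the fillrate column on the Wiki chart doesn't show.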


I can still use an NV 7900 GTX with a single-core AMD 64, and the trees don't seem to do that much for me to notice... this at 1600x1200, which is probably close to the 1080p pixel count. I think the last few patches actually improved performance... this is a minimum system and it lags some at times, but it's basically playable.

Highly unlikely - post a screenshot from offline. Log in as a Brit rifleman and run N/NE till you pass the rifle range. There is a forest a few hundred meters beyond that. Zoom in and take a screenshot.

I'm pretty sure it's a video card architecture or driver issue with the trees... the game doesn't actually seem that graphics-limited, except for the usual problem that when you run high resolutions you're pushing a lot more pixels around.

I prove otherwise here:

http://forums.battlegroundeurope.com/showthread.php?t=356181

100% performance upgrade going from a 9600GT to a GTX 460 SE. Nothing else changed: same driver, same CPU, same RAM, same resolution. What improved was a massive increase in shader processing power.

You'll also notice my 'normal' performance didn't increase much. The reason for that is my CPU power didn't increase, and that is what drives 'world' FPS in WWIIOL.

Getting a Black Edition AMD processor and a 460 SE sounds penny wise and pound foolish... you're not going to get a long run out of those at all.

The 460 is/will be the bottom end soon... you're better off with a 560 or better, or a 6870/6950. At least you're going to have all those millions of 5770 and 460 video cards in 'low-end enthusiast' systems for game coders to try to make happy/playable, and with your faster card you will end up with more enjoyment and a longer stretch till your next FORCED upgrade, so you get more value.

You're assuming he wants a full system refresh; I'm not. The upgrades I suggest will have him gaming for another 2 years easily. BTW, the GTX 460 SE overclocks like crazy and, according to benchmarks, runs all current DX11 and other current games at playable framerates at 1920x1200 resolution.

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/38178-nvidia-geforce-gtx-460-se-1gb-review-16.html

It's a good card and will last long enough.

Question for CRS:

When looking at video card reviews and the list of games that are generally used as benchmarks, which game should we look at as the closest marker to BGE? Generally speaking, I know it wouldn't be possible to have an absolutely direct comparison.

None of them are comparable.


100% performance upgrade going from a 9600gt to a 460gtx se. nothing else changed. same driver, same cpu, same ram, same resolution. what improved was a massive increase in shader processing power.


9600GT was/is a pos :P

Doc, occasionally video card discussions come up about which one or which brand is best to get optimal performance. You have never committed to a specific manufacturer, from what I remember, although you have said (at that time) nVidia had a better fill rate.

My question: is it texture or pixel, or both fillrates, one should look at when deciding? Does the frame buffer play into it at all? Your help is much appreciated.

Nobody really quotes pixel fill rate anymore.

Take nVidia's top card as an example:

49.4 billion textures per second.

To get the pixel fillrate you use the following: core MHz * ROPs = pixel fill rate.

The GTX 580 has 48 ROPs and a core clock of 772MHz, so 772 * 48 = 37,056, or about 37 billion pixels per second.
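The same arithmetic in a couple of lines, covering both fillrates of the GTX 580 example above (the 64-TMU count is taken from the published spec sheet and is an assumption here, not something stated in the thread):

```python
# Fillrate arithmetic for the GTX 580 example above.
core_mhz = 772
rops = 48    # raster output units -> pixel fillrate
tmus = 64    # texture units -> texture fillrate (published spec, assumed here)

pixel_fill_gp_s = core_mhz * rops / 1000.0    # ~37.1 Gpixels/s
texture_fill_gt_s = core_mhz * tmus / 1000.0  # ~49.4 Gtexels/s, the figure quoted above
print(pixel_fill_gp_s, texture_fill_gt_s)
```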

Even that isn't that important, though, as not much is constrained by raw fillrate these days. In WWIIOL the only thing that appears GPU-limited is FPS when zoomed into trees, which is heavily shader-limited.

Apart from the trees, WWIIOL is still heavily reliant on the CPU. You can test your CPU performance by typing .benchremagen in the offline client prior to spawning in. To test your shader performance I suggest going offline and using the forest that is NE of the rifle firing range.

The nVidia 260 and ATI 4xxx line of cards are really all this game 'requires' for good zoomed-in-to-trees performance. A ~3GHz AMD/Core 2 or ~2.6GHz Core i5/i7 gives you good enough FPS everywhere else. More, obviously, is better.

Yeah, it's all about the shaders these days, not like the old days.

I still favour nVidia myself, but there are pros and cons to any choice.

I'll try and get a bit of a rundown presented as an article on our front page after beta is all done with.

But WWIIOL does not take advantage of multiple GPUs, does it? Will WWIIOL do so in the future? And if so, what is the approximate timeline for that?

Based on texture fillrate numbers, ATI trounces nVidia currently. ATI's best single-GPU texture fillrate is their HD 6970 (MSRP $369) at 84.5 GT/s, while nVidia's best is the GTX 560 Ti (MSRP $250) at 52.6 GT/s. The price difference there is quite large, but compare the HD 6950 (MSRP $259) with a whopping 70.4 GT/s fillrate.

The texture fillrate advantage of ATI is currently enormous.

But ATI has a horrendous history of piss-poor driver support, and nVidia's is noticeably better.

Like I said, all choices have pros and cons.

The video card and the CPU are both slow in your case. That vid card is going to seriously choke on the trees in game.

An upgrade to something like a GTX 460 SE and one of the Black Edition AMD CPUs will be more than enough, though.

(quoting the earlier exchange about the 9800GT/460 fillrate numbers and the GDDR5 reply)

If you read back through the whole discussion posted above your reply, you'll see that the whole discussion has been zeroed in on one specific metric, texture fillrate, and it seems to imply that the performance of cards with regard to "shaders" (whatever that means) is determined by texture fillrate.

Is your reply on target, or is it a non sequitur? Is texture fillrate the metric that determines vid-card performance for WWIIOL or not?

...

Edited by oyaji

(quoting oyaji's question above about whether texture fillrate is the metric that matters)

You're getting confused. Brief history lesson.

First came fillrate, which is how many raw pixels you can draw per second, with an assumed texture map per pixel drawn - meaning it is always assumed that if you can draw one pixel then you can also texture it.

Then came double texturing. Pixel fillrate was the same, but now you could apply two textures to each pixel.

The above is sometimes called the fixed-function pipeline and was the cat's *** for a while. nVidia ended 3dfx because it won the fill rate war.

Then shaders came out (hardware transform and lighting was sort of a precursor here). The first shaders were fairly static pixel shaders. Vertex shaders followed shortly thereafter, then finally geometry shaders, and when those arrived shaders were completely split off from the fixed-function pipeline and grew into what we have now with GPGPU programming, where these shader pipelines are extremely fast, parallel processing units that do all kinds of crap.

Shaders allow you to do lots of other things to pixels and textures - things like wind effects on what would otherwise be fixed pixels. Shaders make all that crap move. Hair effects, etc, etc.

To bring this back to WWIIOL: the fixed-function pipeline is of very little relevance to overall performance, as the engine is extremely reliant on raw CPU cycles because the original client engine was written before even the old-school hardware geometry offloading was standardized. Which is why the CPU is such a bottleneck - it and only it is responsible for everything.

Examples of this are basically anything and everything that isn't a tree, a waving flag, or smoke. It's possible they're using shaders for bump mapping too, I don't know. Everyone with a current-gen AMD or Intel gets (or should get) good performance in the regular world.

Zoomed into trees, though, is heavily shader-limited. So if you're having poor performance when looking at trees, then your bottleneck is your shader processing power. If you're having a bottleneck everywhere else, then you're CPU-bottlenecked.

So let's recap.

Fill rate, be it pixel or texture, means almost nothing these days.

Shaders and shader performance are extremely important.

nVidia likes to call them "CUDA cores" and ATI calls them "stream processors". ATI always has more of these than nVidia, but don't let the numbers fool you, as the count of CUDA cores or stream processors is not a 1:1 metric. Having more doesn't mean faster, as the designs differ significantly and are well beyond the scope of this post - I don't really understand the designs myself because it's just a waste of my time to read about it. Just read reviews, see how they perform, and make your buying decision off that.

Edited by madrebel

Highly unlikely - post a screenshot from offline. Log in as a Brit rifleman and run N/NE till you pass the rifle range. There is a forest a few hundred meters beyond that. Zoom in and take a screenshot.

I prove otherwise here.

.benchremagen gives 36 to 44 FPS, with 38-40 being most common... 1600x1200, 32-bit, with half of the visual effects enabled.

Spawned in as a Brit rifleman, FPS is low 30s to mid 80s offline.

Standing on top of the vehicle spawn, middle of the roof closest to the barracks, I get around 50 FPS looking N with the horizon centered on screen; looking E is a little higher, around 55 FPS average; looking S around 75 FPS average; looking W around the low 60s average.

Zoomed in with binocs I'll get as low as 9 or 10 FPS looking at that forest.

I have 3 gigs of RAM (and it's DDR400... not DDR2/DDR3). In online mode the game is still playable for me, and some of the patches have improved performance. The binocs issue, while annoying, doesn't stop me from using them often while playing...

The game still allows very low-end systems to play it... especially when you consider this is at 1600x1200. Most low-end systems would run lower-resolution monitors.

Edited by Vampress


Zoomed in with binocs I'll get as low as 9 or 10 FPS looking at that forest.

This is what I was talking about. 9 FPS isn't playable, and that old card is the limitation. If you swapped that thing for, say, a 5670 or a GTS 250, the game would be playable with a few slowdowns here and there.

This is what I was talking about. 9 FPS isn't playable, and that old card is the limitation. If you swapped that thing for, say, a 5670 or a GTS 250, the game would be playable with a few slowdowns here and there.

?

Who plays looking through their binocs?

You're standing still, not moving, while using them... low FPS, while annoying, really doesn't affect the functionality of the binocs. 10 FPS is fast enough to scan with (and it's usually a lot higher than that... that's a minimum). If it were something like 1 or 2 FPS it would be unusable.

The card is like 4 generations old already... the 8800s, the 200s, the 400s, and now the 500s have long passed it by... yet it can still play the game.

Of course it would be faster with a new video card...

Edited by Vampress

?

Who plays looking through their binocs?

You're standing still, not moving, while using them... low FPS, while annoying, really doesn't affect the functionality of the binocs. 10 FPS is fast enough to scan with (and it's usually a lot higher than that... that's a minimum). If it were something like 1 or 2 FPS it would be unusable.

Ever try aiming at and hitting a tank running along the edge of a forest at 9 FPS?

The card is like 4 generations old already... the 8800s, the 200s, the 400s, and now the 500s have long passed it by... yet it can still play the game.

Sorry, but it can't 'play' the game. 9-10 FPS is not playable.


AMD has the best power/performance ratio right now; the 6870 will be my next one. The 560 Ti is a nice one too - similar performance, but 20W more.


5kkfGR_n3D0

This sh*t is nuts. I love Eyefinity. Hard to record with multiple monitors, but damn is it sweet.
