vanduzer

Doc, Fillrate question?

48 posts in this topic

Doc, occasionally video card discussions come up about which card or brand gives the best performance. You have never committed to a specific manufacturer from what I remember, although you did say (at the time) that nvidia had the better fill rate.

My question: is it texture fill rate, pixel fill rate, or both that one should look at when deciding? Does the frame buffer play into it at all? Your help is much appreciated.

nobody really quotes pixel fill rate anymore.

take nvidia's top card as an example

49.4 billion texels per second of texture fill rate

to get the pixel fill rate you use the following: core MHz * ROPs = pixel fill rate.

the gtx580 has 48 ROPs and a core clock of 772MHz, so 772 * 48 = 37.06 billion pixels per second.
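As a sanity check, the arithmetic above works out like this (a throwaway Python sketch; the GTX 580 figures of 772 MHz core, 48 ROPs and 64 TMUs are the ones quoted in this thread, not re-verified):

```python
# Theoretical fill rates are just clock * unit count; nothing vendor-specific.

def pixel_fillrate_gpixels(core_mhz, rops):
    """Pixel fill rate in gigapixels per second: core MHz * ROPs."""
    return core_mhz * rops / 1000.0

def texture_fillrate_gtexels(core_mhz, tmus):
    """Texture fill rate in gigatexels per second: core MHz * TMUs."""
    return core_mhz * tmus / 1000.0

# GTX 580: 772 MHz core, 48 ROPs, 64 TMUs (figures as quoted above)
print(pixel_fillrate_gpixels(772, 48))    # ~37.06 GPixel/s
print(texture_fillrate_gtexels(772, 64))  # ~49.41 GTexel/s
```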

even that isn't that important though, as not much is constrained by raw fill rate these days. in ww2ol the only thing that appears GPU limited is fps when zoomed in on trees, which is heavily shader limited.

apart from the trees ww2ol is still heavily reliant on the cpu. you can test your cpu performance by typing .benchremagen in the offline client prior to spawning in. to test your shader performance i suggest going offline and using the forest NE of the rifle firing range.

the nvidia 260 and ati 4xxx line of cards are really all this game 'requires' for good zoomed-in-to-trees performance. ~3GHz AMD/Core 2 or ~2.6GHz Core i5/i7 gives you good enough fps everywhere else. more is obviously better.

ATI offer the best power/price ratio.

Nvidia if you want to set your house on fire.

Wrong forum btw.

ATI offer the best power/price ratio.

Nvidia if you want to set your house on fire.

Wrong forum btw.

depends on the price range. nvidia has a slight edge at most of the important price points right now.

yeah it's all about the shaders these days, not like the old days

I still favour nVidia myself, but there are pros and cons to any choice

I'll try and get a bit of a rundown presented as an article on our front page after the beta is all done with

awesome thread!!! read this on the way to reims and want to hear more. oh yeah, nvidia rules the universe!

Edited by hot74rod

...

But WWIIOL does not take advantage of multiple GPUs, does it? Will WWIIOL do so in the future? And if so, what is the approximate timeline for that?

Based on texture fillrate numbers, ATI trounces nVidia currently. ATI's best single GPU texture fillrate is their HD 6970 (MSRP $369) at 84.5 GT/s while nVidia's best is the GTX 560Ti (MSRP $250) at 52.6 GT/s. The price difference there is quite large, but compare the HD 6950 (MSRP $259) with a whopping 70.4 GT/s fillrate.

The texture fillrate advantage of ATI is currently enormous.
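For what it's worth, dividing those quoted fill rates by MSRP makes the value argument concrete (a quick Python sketch using only the numbers in this post; the specs and prices are the poster's, not re-checked):

```python
# GT/s per dollar, from the MSRP and texture fill-rate figures quoted above.
cards = [
    ("HD 6970",    369, 84.5),
    ("GTX 560 Ti", 250, 52.6),
    ("HD 6950",    259, 70.4),
]

# Sort by fill rate per dollar, best value first.
for name, msrp, gts in sorted(cards, key=lambda c: c[2] / c[1], reverse=True):
    print(f"{name:<10} {gts:5.1f} GT/s @ ${msrp} -> {gts / msrp:.3f} GT/s per dollar")
```

On that metric the HD 6950 comes out on top, which is the point being made here.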

...

Edited by oyaji

but ATI has a horrendous history of piss-poor driver support, and nVidia's is noticeably better

like I said, all choices have pros and cons

Are you in any position to provide insight into your team's direction regarding multiple gpu support DOC?

Are you in any position to provide insight into your team's direction regarding multiple gpu support DOC?

(embedded YouTube video: ED1CZxLR38E)

Eyefinity. Just sayin. As soon as I get my other two monitors in I will be in heaven.

But WWIIOL does not take advantage of multiple GPUs, does it? But will WWIIOL do so in the future?

^^^^^^^^^this^^^^^^^^^^^

I need to buy a new computer and a new video card

Which one should it be?

the nvidia 260 and ati 4xxx line of cards are really all this game 'requires' for good zoomed-in-to-trees performance. ~3GHz AMD/Core 2 or ~2.6GHz Core i5/i7 gives you good enough fps everywhere else. more is obviously better.

Sometimes I think WWIIOL can't use more than a simple Nvidia 260.

When I set my Nvidia 280 to run at PCI-E x4 or x16 I don't see any difference in my fps, and the card stays cold.

I get a boost of around 2.5x in World of Tanks and around 2.2x in the latest Need for Speed.

CPU > RAM > GPU all the way in WWIIOL.

Sometimes I think WWIIOL can't use more than a simple Nvidia 260.

When I set my Nvidia 280 to run at PCI-E x4 or x16 I don't see any difference in my fps, and the card stays cold.

I get a boost of around 2.5x in World of Tanks and around 2.2x in the latest Need for Speed.

CPU > RAM > GPU all the way in WWIIOL.

That is a really good point. I'd like to get a better framerate in the game, but in air combat I am getting 40-60 FPS (on minimum settings).

For me the burning question is "what do I need to get good enough performance to record video with PlayClaw or Fraps?" I've been wanting to do so for years (in particular for instructional game vids), yet with this computer I've fallen flat:

  • AMD Phenom II X4 925 quad-core processor (2.8GHz)

  • 4GB DDR3 1333 RAM

  • nVidia 9800 GT (512MB) graphics card

  • M4A785TD-M EVO motherboard

  • 300GB PATA HDD for system and applications plus 1TB RAID0 (2x500GB drives) on SATA II (3Gb/sec) interface

I get just 9 to 11 FPS recording video. Where is the weak link - the vid card? Would I be better off just buying a cheap GDDR5 card, or do I need to wait and buy an HD 6950? Whatever card I get needs to last me a couple of years at least...

...

the video card and the cpu are slow in your case. that vid card is going to seriously choke on the trees in game.

an upgrade to something like a GTX 460 SE and one of the Black Edition AMD cpus will be more than enough though.

the video card and the cpu are slow in your case. that vid card is going to seriously choke on the trees in game.

an upgrade to like a 460gtx se and one of the black edition amd cpus will be more than enough though.

Wanna hear sumpin' funny? The 9800GT texture fill rate is about 34 GT/sec and the 460 is around 37 GT/sec - nearly identical! And the old 9800GTX+ pumps out 47GT/sec - more than the best of the 400-series! Look here: http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_processing_units

One fella told me that the GDDR5 RAM makes a big difference over the older GDDR3 on board the 9800GT. But the performance numbers listed are nearly the same between my card and the 460. So which do I believe?

...
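One way to reconcile those numbers: the GDDR5 advantage shows up in memory bandwidth, not texture fill rate. A rough Python sketch (the transfer rates and bus widths below are commonly quoted reference-card specs and should be treated as approximate):

```python
# Theoretical memory bandwidth: effective transfer rate (MT/s) * bus width
# (bits) / 8 gives MB/s; dividing by 1000 gives GB/s.

def bandwidth_gbs(effective_mts, bus_bits):
    """Peak memory bandwidth in GB/s."""
    return effective_mts * bus_bits / 8 / 1000

# 9800 GT: GDDR3 at ~1800 MT/s effective on a 256-bit bus
# GTX 460 1GB: GDDR5 at ~3600 MT/s effective on a 256-bit bus
print(bandwidth_gbs(1800, 256))  # ~57.6 GB/s
print(bandwidth_gbs(3600, 256))  # ~115.2 GB/s
```

So despite near-identical texture fill rates, the 460 has roughly double the memory bandwidth, which is what matters at higher resolutions and with antialiasing.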

the video card and the cpu are slow in your case. that vid card is going to seriously choke on the trees in game.

an upgrade to like a 460gtx se and one of the black edition amd cpus will be more than enough though.

I don't know that I buy that...

I can still use an NV 7900 GTX with a single-core AMD 64, and the trees don't seem to do much that I notice... this at 1600x1200, which is probably close to the 1080p pixel count. I think the last few patches actually improved performance... this is a minimum system and it lags some at times, but it's basically playable.

I'm pretty sure it's a video card architecture or driver issue with the trees... the game doesn't actually seem that graphics limited except for the usual problem that when you run high resolutions you're pushing a lot more pixels around.

I DO NOT LIKE bottom-end upgrades:

getting a Black Edition AMD processor and a 460 SE sounds penny wise and pound foolish... you're not going to get a long run out of those at all.

the 460 is/will be the bottom end soon... you're better off with a 560 or better, or a 6870/6950... at least you're going to have all those millions of 5770 and 460 video cards in 'low-end enthusiast' systems for game coders to try to make happy/playable, and with a faster card you will end up with more enjoyment and a longer stretch until your next FORCED upgrade, so you get more value.

I'd probably suffer through with your processor for now and just upgrade the M/B to an i5/i7 or Bulldozer (summer... maybe)... your processor is slow though and probably the weakest link in your system.

Edited by Vampress

Single monitor? Go Nvidia.

Multi monitor? Go ATI.

and I've really got to update that picture... looks a lot better these days.

That is a really good point. I'd like to get a better framerate in the game but in air combat I am getting 40-60FPS (on minimum settings)

For me the burning question is "what do I need to get good enough performance to record video with PlayClaw or Fraps?" I've been wanting to do so for years (in particular for instructional game vids), yet with this computer I've fallen flat:

  • AMD 925 Phenom II x4 quad-core processor (2.8GHz)

  • 4GB DDR3 1333 RAM

  • nVidia 9800 GT (512Mb) graphics card

  • M4A785TD-M EVO motherboard

  • 300GB PATA HDD for system and applications plus 1TB RAID0 (2x500Gb drives) on SATAII (3Gb/sec) interface

I get just 9 to 11 FPS recording video. Where is the weak link - the vid card? Would I be better off just buying a cheap DDR5 card, or do I need to wait and buy a HD 6950? Whatever card I get needs to last me a couple years at least...

...

What resolution?

Not sure why Playclaw would be killing it that bad considering your system specs. Either there is a driver issue or there's a hardware bottleneck at the vidcard.

I've a budget system:

  • Intel i5 750

  • 2GB DDR3 1333

  • Biostar T5xe mobo

  • ATI 4860

  • single 300GB SATA drive

At 1680x1050 with most settings on high I can still run PlayClaw and hold over 30 fps around contested towns.

but ATI has a horrendous history of piss-poor driver support, and nVidia's is noticeably better

For the reader, I have to take issue with Doc here. ATI 'had' a stretch of bad drivers in the late 90s and early 2000s. Since then, they've been fine. Neither company has been excellent.

It was only a year ago that Nvidia had to recall one of their own drivers after release. And in the last couple of years, Nvidia users who play BE have at times found the need to downgrade their drivers (install older ones) to play the game - at least until another driver update has come out.

In my opinion, since both manufacturers have attempted to make monthly official driver releases (they don't always make it), they've had more issues with newer hardware, especially when Windows 7 released, and with the surprising growth amongst gamers of its 64-bit version.

Read the readmes of the driver releases from either company, if you can stay awake, and you'll come across long lists of games and other applications with various problems in each release, though often they state the problems are the developers', i.e., they've tested and concluded that their driver and hardware are not the cause.

Wanna hear sumpin' funny? The 9800GT texture fill rate is about 34 GT/sec and the 460 is around 37 GT/sec - nearly identical! And the old 9800GTX+ pumps out 47GT/sec - more than the best of the 400-series!

Unfortunately, the series number, especially the first digit in the 4-number series designations, is less and less meaningful.

I recommend always checking Tom's Hardware.

Look for the most recent "Best Graphics Cards For the Money" article (they do one each month).

http://www.tomshardware.com/reviews/best-gaming-graphics-card-geforce-gtx-590-radeon-hd-6990,2879.html

Check the last page of those articles, which is the Graphics Card Hierarchy Chart:

http://www.tomshardware.com/reviews/best-gaming-graphics-card-geforce-gtx-590-radeon-hd-6990,2879-7.html

For the reader, I have to take issue with Doc here.

BLOO gets the 2011 "BALLS OF STEEL AWARD"

Lol, you're farked!

But good wrap BLOO, an objective answer. Although I am mainly an Nvidia guy, I had ATI in the past (12 months back) and had a crap time. The bloody drivers that shipped with the card wouldn't work; I had to put the old card in and download some.

But I hear they are getting better.

Yes, NVid have done some strange things in the last 12 months too, crack must be going cheap lately.

But I just put the latest beta drivers on my SLI'd GTX 470s and Nvidia mobo and they really seem sweet. I did notice a difference, although my machine is a little overkill as it is.

Edited by klemzig
