piercer

Everything posted by piercer

  1. I run a dual-core E6600 @ 3.5GHz w/ 1920x1200 on an 8800GTX -- I average 60+ FPS with occasional drops into the 30-40 range during heavy fighting. That said, it's the 3.5GHz that makes the big difference in this game; WWIIOL is very CPU intensive. The 8800GTX is nice and gives me the 1920x1200, and I can use AA without much impact to FPS, but it's really the CPU that is driving the performance -- that, and the fact that I'm using an X-Fi to offload all the audio from the CPU. If the Rats ever optimize for multi-core (hopefully eventually) then having dual-core, quad-core, etc. will make a bigger impact; today, though, the game is bound to the one core it runs on (see the sketch below). The reality is that multi-core is where CPU performance comes from as we move forward, so I'm hoping these optimizations come soon; chip dies are hitting limitations that are going to slow single-core performance gains in the near term. P-
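     To illustrate the single-core point, here's a minimal Windows sketch (my own toy, not anything from CRS) that pins a thread to one core -- effectively what a single-threaded game loop does on a multi-core CPU. SetThreadAffinityMask is a real Win32 call; the core choice is just an example.

     #include <windows.h>
     #include <stdio.h>

     int main(void)
     {
         /* Pin this thread to logical core 0 (mask bit 0) -- roughly how a
            single-threaded game loop behaves even on a dual/quad-core CPU. */
         DWORD_PTR old = SetThreadAffinityMask(GetCurrentThread(), 1);
         if (old == 0) {
             printf("SetThreadAffinityMask failed: %lu\n", GetLastError());
             return 1;
         }
         printf("Main thread bound to core 0; the other cores sit idle.\n");
         return 0;
     }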
  2. I posted this in the Barracks because it was greatly affecting my inf play, but it should be here as well: http://forums.battlegroundeurope.com/showthread.php?t=183312 P-
  3. I've been overclocking my systems pretty much since manufacturers started including BIOSes that let you do it. I overclock my GFX board as well. Overclocking takes time, education, and patience. There are some great sites out there, especially: http://www.overclock.net/ -- pretty much everyone who overclocks has used that site and references it for tweaking their machines. I'd recommend reading as much as possible; don't just jump into your BIOS and start changing settings, that's a sure way of creating an unstable system, lots of frustration, and the needless fear that you might have fried your system (which can happen). Did I mention patience? Remember that not all chips are created equal. Some folks hit really high overclocks at decent temperatures, while others hit a wall at only a 10-20% overclock on the same chips. There is a lot to consider: memory timings, voltages, temperature, northbridge/southbridge frequency, PCI locking, etc. all have an impact on overall system stability. Once you have a stable overclock and it's been tested with memtest and Orthos for 24+ hours, then you're 'pretty much' good to go, though it's not unheard of to have occasional unstable blips even after all tests give you green lights; that's just the nature of it all (a toy sketch of what that kind of stress testing does is below). For me: Originally: an E6600 dual-core, 2.4GHz stock @ 1.3v, idle @ 26C on stock cooling, with Corsair memory at 4-4-4-12 1T @ 800MHz (DDR2). Now: E6600 @ 3.5GHz (stable now for 3 months), 1.425v, idle at 35C using Swiftech water cooling, memory 4-4-4-12 2T @ 1200MHz (DDR2). This also included voltage increases on most of the motherboard settings to aid in system stability. My system hits a wall right around 3.6GHz that requires a large increase in CPU voltage (1.425 -> 1.525) and is very difficult to get stable, and the temperatures start going through the roof -- 1.525v @ 3.6GHz is 43C on water cooling. I've decided that 1.425v is a good voltage and 3.5GHz is stable, so I'm sticking with it for now. P-
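     To give a flavor of what Orthos-style torture testing actually does, here's a toy sketch of my own (not Orthos itself): hammer the CPU with math that has a known answer and flag any mismatch, since wrong results under load are the classic sign of an unstable overclock. Real tools use large FFTs; the principle is the same.

     #include <stdio.h>
     #include <math.h>

     int main(void)
     {
         /* Toy stability burn: sin^2(x) + cos^2(x) is identically 1, so any
            answer outside rounding noise means the CPU is computing garbage
            under load -- i.e. the overclock is unstable. */
         for (long pass = 1; ; pass++) {
             for (long i = 1; i <= 10 * 1000 * 1000; i++) {
                 double x = (double)i * 1e-6;
                 double v = sin(x) * sin(x) + cos(x) * cos(x);
                 if (fabs(v - 1.0) > 1e-12) {
                     printf("pass %ld: MISMATCH at i=%ld -- unstable!\n", pass, i);
                     return 1;
                 }
             }
             printf("pass %ld OK\n", pass);  /* let it run for hours */
         }
     }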
  4. It's a bug that's been around the last few versions. It has something to do with the SSE2 code. If you think you're going to fly, unselect the 'Run SSE2 Executable' option under the settings app in the CRS directory. P-
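     For the curious, here's a small sketch (my own, not CRS code) of the kind of runtime check a launcher would use to decide whether the SSE2 build can run at all. GCC/Clang's __builtin_cpu_supports does the CPUID work; MSVC users would call __cpuid instead.

     #include <stdio.h>

     int main(void)
     {
         /* __builtin_cpu_supports queries CPUID under the hood (GCC/Clang). */
         if (__builtin_cpu_supports("sse2"))
             printf("CPU supports SSE2 -- the SSE2 executable can run.\n");
         else
             printf("No SSE2 -- stick with the non-SSE2 executable.\n");
         return 0;
     }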
  5. Yea, on my 8800GTX / dual-core E6600, it went from 15fps to 60fps by turning off threaded optimization -- you need a dual-core for it to make a difference. P-
  6. A couple of comments reading through this thread. A) Artifacting is not a good thing, especially if you haven't touched the card for overclocking. B) NVIDIA drivers still have a few issues in them, but I'm not sure they would cause some of the issues you are talking about. That artifact you posted in the picture above I've had in Oblivion, but after a few driver releases it went away. C) Make sure you have turned off 'Threaded Optimization' under NVIDIA Control Panel -> Manage 3D Settings -- this is an FPS killer, a known issue with video cards, and can take a 50fps system down to a 10fps system. D) I agree that the power supply may be your best bet; without full system specs it's hard to say, but a 7900 isn't exactly a PSU hog -- an 8800GTX, that's a different matter. E) It could very well be a bad board. If you're still having problems after the PSU upgrade, download ATITool and run the 3D artifact test (it's a fuzzy cube). If it's running and you start seeing yellow dots everywhere, then I'd recommend an RMA, as that's usually a sure indication of a bad GPU or bad RAM on the video board. Lastly, your temps sound on the high end, and that can be a big contributor to artifacting. Sometimes the manufacturing process puts too much thermal paste on the GPU, creating an insulating effect rather than heat transfer. The Zalman cooler should fix that. P-
  7. Nope, not a beta game; matter of fact, what you are experiencing is more realistic than most other games of this genre. What you are experiencing is prop torque and propwash, two frustrating elements of propeller-based aircraft. Most games don't incorporate this into the 'flight model', as the goal is to keep people who don't have the patience from throwing away the game. What's that? You tried flying a Spitfire and crashed it... HOOORRAAYYY (that's axis vs. allied sarcasm). But kidding aside, my advice is to practice flying offline, and for the first little while of playing stick to infantry, tanks, ATGs, etc... the air war is another game unto itself. I'd advise you to do a lot of reading in the forum titled "The Hangar", especially the stickies, as they have a lot of good content to help you understand the game of flight (a toy illustration of the torque effect is below). Also, start flying the bombers, as they aren't as hard to take off in; though they move slower, they'll give you a base for how to fly. As they say, "If you find taking off frustrating, wait until you have to land" -- or land with a damaged airplane. Also, there was a time when 'locking' the tail wheel was not the default state. It was always fun to watch new flyers spin around on the airfield like tops. Oh, the good ole days :-D P-
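     If you're wondering why torque rolls you, here's a toy illustration of my own (not WWIIOL's actual flight model): by Newton's third law the airframe receives the reaction to the torque spinning the prop, so hands-off the roll rate just keeps building. The torque and inertia numbers are assumed ballparks.

     #include <stdio.h>

     int main(void)
     {
         const double engine_torque = 900.0;  /* N*m -- assumed ballpark */
         const double roll_inertia  = 6000.0; /* kg*m^2 roll axis -- assumed */
         const double dt = 0.1;               /* s per step */
         double roll_rate = 0.0;              /* rad/s */

         /* Integrate 2 seconds of hands-off flight: the reaction torque
            accelerates the roll the whole time. */
         for (int step = 0; step < 20; step++)
             roll_rate += (engine_torque / roll_inertia) * dt;

         printf("After 2s hands-off: rolling at %.1f deg/s\n",
                roll_rate * 180.0 / 3.14159265);
         return 0;
     }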
  8. K6 -- Nope, I have P21. The 680i audio issue with X-Fi is a completely different beast with the 8800GTX board. It has something to do with the EVGA timings on the PCI bus -- or so it appears. A timing issue creates a different set of sounds, like a screeching static sound -- I get those as well, but I'm keen to know the difference. The problem headkvist is talking about is distinctly an 'in-flight' audio bug. Not sure why the X-Fi gets it messed up -- I thought I remembered reading somewhere that WWIIOL had some special code for the X-Fi embedded in the system.... P-
  9. Hmmm... I'm having this same problem. BUT, I have an insight on this. Up until 3 weeks ago I had an A8N32-SLI Deluxe mobo w/ a 4200X2 AMD along with an X-Fi board. I never had any audio issues with the X-Fi. Fast forward: I now have an EVGA 680i motherboard with an Intel Core2 Duo E6600. Popped the X-Fi in there and I've had sound issues up the.... anyhow, same problem... on the ground it's fine, in the air it fades in and out, sometimes dies completely. So, it might be something with the Core2 or the supporting nvidia chipset for the Core2. P-
  10. All these problems are addressable. I think your #1 problem is due to your firewall; read the stickies at the top of this forum, I believe it's addressed in one of those. Just a comment on flying: don't use flying as your way of developing an opinion about the game. Flying is by far, IMHO, the hardest thing to learn in this game; almost everything else is much simpler to learn (simple to learn doesn't necessarily mean you die less). The flight models are, as others mentioned, made to be very realistic: propwash, engine torque, overheating your engine, etc... are all circumstances of pre-jet-era aircraft. Once you master getting your plane off the ground, there are about 100 more variables to deal with once you get into combat. It's the most frustrating experience, yet the most satisfying, when you hit an enemy just right and see his engine ignite or wing fly off. P-
  11. The E6600 is very overclockable; sure, you can't 'promise' or 'guarantee' it, but there are hundreds, if not thousands, of threads out there about folks clocking up to 3GHz, on air, with zero voltage increase. Let's be realistic about overclocking in general: CPU manufacturers select the GHz speed based on 'yields' -- that is, they test their first production chips and overclock them, increase voltages, again and again, to see what the yield percentage is at higher clock rates. So some E6600s under extreme circumstances can hit over 4GHz; some, maybe not beyond 3.2GHz. It's no different than how LCD manufacturers determine their glass sizes and costs based on dead-pixel yields. The one thing about the Core 2 Duos is that right out of the chute folks were easily clocking them above 3.2GHz, which in itself is odd, because Intel could've come out with higher-GHz chips and really stomped all over AMD even more. Why, though, did they select sub-3GHz levels if so many people are clocking above 3GHz? Nobody knows for sure except the team at Intel -- the only reason I can figure is that an off-the-shelf computer isn't just a CPU; it's the HTT, FSB, memory timings, PSU, bus speeds, etc... If Intel had come out with 3.0GHz+ chips from the get-go, the low-end price point of the 'complete' system would have been too costly, especially with memory chips in the 800MHz+ range once you start figuring in latencies, available mobos, etc... I'm thinking they started with sub-3.0GHz chips to keep aligned with memory clock frequencies and packaged power supplies on high-volume off-the-shelf systems (DELL et al). Someone earlier posted they were having a hard time getting their E6600 up to 3.0GHz; this is most likely because your memory divider needs to be adjusted, not because the CPU can't do it, which illustrates the problem -- the quick math below shows why. P-
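     Here's that math spelled out, with assumed numbers (check your own board's dividers): the E6600's multiplier is locked at 9x, so CPU clock = FSB x 9, and your memory runs at FSB times the divider ratio, times 2 for DDR.

     #include <stdio.h>

     int main(void)
     {
         const double multiplier = 9.0;   /* E6600's locked multiplier */
         double fsb = 333.0;              /* MHz, raised from the stock 266 */

         printf("CPU: %.0f MHz\n", fsb * multiplier);         /* ~3.0GHz */
         /* DDR2 effective speed = FSB * divider ratio * 2. If your RAM
            can't handle the result, change the divider -- don't blame
            the CPU. */
         printf("Mem 1:1 -> DDR2-%.0f\n", fsb * 1.0 * 2.0);   /* 667 */
         printf("Mem 5:4 -> DDR2-%.0f\n", fsb * 1.25 * 2.0);  /* 833 */
         return 0;
     }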
  12. I had always bought AMD chips -- that is, up until 2 weeks ago. Look anywhere on the net at performance benchmarks: by far, the Core2 Duo whoops on AMD left and right. Sure, Intel changes sockets like dirty underwear, but over time AMD and Intel have changed sockets about equally; it's just that in the last couple of years Intel has been doing it more often. So, all things being the same: my 4200+ overclocked to 2.7GHz with my new 8800GTX board, in a BUSY town, would give me between 25-40 fps. My new E6600 at stock 2.4GHz (this thing will clock over 3.0GHz, no problem, and people have taken them as high as 3.8GHz -- on air) with my 8800GTX board now gives me ~40-70 in a BUSY town. The low-band discrepancy is most likely because they weren't the same cities, so you can't do an exact comparison. Looking at AMD's roadmap, I think you're good with a Core2 for the near term. P-
  13. Here is a screenshot I took tonight: this is running full settings. What it's showing is that now I'm CPU bound HAHAHAHA.... Guess it's time to get a Conroe!! This is at 16xAF/16xAA @ 1920x1200x32, full settings. AND!! With the 96.89 drivers that came with the new board, NO WATER SHADER DELAYS.... P-
  14. Yea, I'm an upgrade addict as well, I think I need to start a 12 step UA program in my neighborhood :-D I've got the itch to go and buy the new 680i w/ 2GB and a Kentsfield -- somebody stop me!!!!!!!!!!!!!!!!!!!!!!!!!!!!! It's not my fault though (DENIAL) -- It's the Rats fault for nerfing my FPS, I hear if you run wwii on a Cray SMP system some people actually achieve 50+ FPS out in the middle of nowhere :-P So far in the last 6 months: New Mobo New Case New GFX Card (2nd) New PSU New FlightStick New Throttle And I have no place to put the old stuff, so it sits in my garage -- Need to ebay a few things :-O Yikess....and I thought my wife had a shoe problem. P-
  15. Couple things -- * Almost all benchmarks show differences in overall performance using different CPUs with stock 8800GTXs -- so there are some cases of the CPU not being able to keep up with the card. BUT, these are all folks doing 3DMark06 at the default settings (1280x1024) -- what really needs to be shown is a very graphics-intensive game, like FEAR, running full settings on different CPUs -- then we're talking more real-world performance. Or people need to post 1600x1200 3DMark06 scores. Last night I did a benchmark on FEAR, all in-game settings maxed, with 16xAF, 16xAAQ multisampling, High Quality -- Max: 163 / Min: 31 / Avg: 63. * Oblivion -- looks awesome with HDR!!!!! * PSU -- seems you can get by with a 500w PSU as long as you have enough amps on the 12v rail (quick math below). I have a Thermaltake 650w Toughpower with modular cabling. * My 3DMark06 score w/ AMD 4200 @ 2.8GHz = 9070. * There's some guy using expensive cooling who's overclocked both the 8800GTX and his Conroe (past 4GHz) and achieved a record 3DMark06 score of ~24,000 -- now that's friggin amazing. P-
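     The 12v rail bit is just watts = volts x amps; a quick sketch with a made-up rating (read the real amperage off your PSU's label):

     #include <stdio.h>

     int main(void)
     {
         double amps_12v = 42.0;  /* combined 12v amperage -- made-up example */
         printf("12v rail capacity: %.0f W\n", 12.0 * amps_12v);  /* 504 W */
         return 0;
     }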
  16. HMmmmmmmmmmmmmmmmmmmmmm..... Are you saying that there's going to be real widescreen support?? Like with correct aspect and wider FOV?? OMG...That would be amazing. Or, are you saying you're going to fix the overlay layers to correct the aspect ratio? P-
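     (For reference, 'correct' widescreen usually means Hor+: hold the vertical FOV and widen the horizontal one to match the aspect ratio. A generic sketch of that math -- my assumption of how it would work, not CRS's code:)

     #include <stdio.h>
     #include <math.h>

     int main(void)
     {
         const double PI = 3.14159265358979;
         double vfov = 60.0 * PI / 180.0;  /* assumed vertical FOV of 60 deg */

         /* Standard Hor+ formula: hFOV = 2 * atan(tan(vFOV/2) * aspect). */
         double h43   = 2.0 * atan(tan(vfov / 2.0) * (4.0 / 3.0))   * 180.0 / PI;
         double h1610 = 2.0 * atan(tan(vfov / 2.0) * (16.0 / 10.0)) * 180.0 / PI;

         printf("hFOV at 4:3 = %.1f deg, at 16:10 (1920x1200) = %.1f deg\n",
                h43, h1610);  /* ~75.2 vs ~85.5 */
         return 0;
     }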
  17. It's showing the DDR speed for everyone (2x the actual clock -- 500MHz shows as 1000MHz). It was a change they made; I bet they roll it back, as it's confusing the heck out of people. You'll see a lot of people say things like "I overclocked to 650/1500" -- they didn't take the mem clock to 1500, they took it to 750. I think it was dumb on NVIDIA's behalf to do this, because uneducated overclockers are likely to get it wrong and wind up either underclocking their memory or RMA'ing their boards when they burn them up from pushing them past safe limits. P-
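     The arithmetic, spelled out (the clock is just an example value):

     #include <stdio.h>

     int main(void)
     {
         double real_mhz = 750.0;  /* what you actually set -- example */
         /* DDR transfers twice per clock, hence the doubled "effective" figure. */
         printf("%.0f MHz real = %.0f MHz effective\n", real_mhz, real_mhz * 2.0);
         return 0;
     }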
  18. A) If you're looking at Antec, check out the Nine Hundred case as well. Thing is a cooling monster; it dropped my case/CPU temps 8-10°C. B) Memory: at 2GB there's a high probability you'll be swapping in and out of the pagefile; whether 4GB helps depends on settings. Keep in mind that 32-bit XP tops out around 3.5GB usable, and I've read that some mobos won't even boot with 4GB (dumb, huh) -- I'd recommend checking around on that (a quick way to see what Windows actually exposes is sketched below). Also, with 2 sticks of RAM you'll most likely be able to keep a 1T command rate vs. having to jump to 2T. P-
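     Here's a quick sketch of my own using the Win32 GlobalMemoryStatusEx call -- on 32-bit XP with 4GB installed it typically reports somewhere around 3.0-3.5GB:

     #include <windows.h>
     #include <stdio.h>

     int main(void)
     {
         MEMORYSTATUSEX ms = { sizeof(ms) };  /* dwLength must be set first */
         if (GlobalMemoryStatusEx(&ms))
             printf("Physical RAM visible to Windows: %.2f GB\n",
                    ms.ullTotalPhys / (1024.0 * 1024.0 * 1024.0));
         return 0;
     }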
  19. Broke the link and reposted it here: WARNING -- BIG AXE FILE :-D http://www.katmaicube.com/wwol/images/SShot12_dl.bmp OK -- I added these today all Max Settings: 16xQ(AA) / 16x Anisotropic / Multisampling AA on Transparency. Oh, and I saved them as PNG files. I was trying to show some of the HDR and other views. http://www.katmaicube.com/wwol/images/SShot13.png http://www.katmaicube.com/wwol/images/SShot14.png http://www.katmaicube.com/wwol/images/SShot15.png P-
  20. Hmmmm...that reminds me, I now have to figure out what to do with my now ex-7900GT overclocked PNY board. P-
  21. Yea, I wasn't sure what format to post in that would stay lossless. Big file, but if you want an accurate capture that's as close as possible to what's on screen, that's it. I've never used PNG; is it a better format for posting? The image is hosted on my own website and I get gigabytes of bandwidth, so the file should be ok (I hope :-O ) Some more observations: My experience so far is that the HDR seems to be much cleaner -- I think this is what someone asked earlier about the haziness -- things are hazy, but not so hazy that it looks more like 'blooming' than haze. I've also noticed that the mipmaps seem to be enhanced/cleaner than on the 7900GT board I yanked out -- at 16x AF I do notice a large quality difference. ANTI-ALIASING: I think the 16xAA works well, though I do notice some minor 'flaws' that hopefully will be addressed in driver revisions; there's almost a different type of 'stepping' introduced at this level of antialiasing -- look at the French flag pole on the right vs. on the left, the angle difference kind of shows this, where the averaging creates a thicker dark line. But you can also see that in most cases 16xAA removes jaggies everywhere. GAMMA: I need to adjust gamma; even though it's 8:30, the contrast between different shadow areas is too heavy. I found myself last night putting my face 6" from the screen to 'see' in some areas. FLICKERING: On my 7900GT with the older drivers I got a lot more flickering on leaves in trees. Under the new drivers and board the flickering appears to be almost gone; I'll do some more checking tonight on this. BLOO & KFS1 -- There are some problems at 1920x1200 with text on the mission screens. Should I screen-capture them and attach to a thread? Or do you want me to compress them and e-mail them to you? The text 'baselines' seem to be off. Not a deal-killer, just looks funky.
  22. I reduced the size because 1920x1200 is a big file ;-) Couple things: A) Water shaders don't seem to be doing the delay -- could be the driver, could be the new shader architecture. B) I'll need to pay attention to the haziness; these new boards handle image quality (IQ) much better than in the past -- they exceed ATI's IQ capabilities now. C) Because the game doesn't officially support 1920x1200, there is some stretching going on when you are in widescreen mode. You get used to it pretty quick though. P-
  23. WOOOTT!!! It's Nov. 8th -- I'm going to pick my 8800GTX up today!!! Here's one of the first benchmark reviews of the GTS: http://www.tweaktown.com/articles/977/ The GTS burns everything out there, imagine what the GTX will do. I'll be able to finally run everything in 1920x1200!!!!!!!!!!!! P-
  24. I for one am buying the 8800GTX when it comes out. The GTX has been seen as low as $580 and as high as $800 -- a single GTX board outperforms the 7950 GX2, and you don't have to deal with SLI profiles, which in itself is a good reason to get one. When the refresh starts, I'll buy a 2nd one over eBay to have a high-performance SLI system. As for DX10 -- it's optimized for DX10, but in current DX9 benchmarks it's outperforming the 7950 and the 1900 from ATI by a substantial margin. P-
  25. Yea, the GTO board is very enticing -- supposedly they're boards that didn't make the GTX cut due to initial part failures -- they were then refurbed but sold as GTO instead of GTX. I've read articles about people pushing them beyond 700MHz -- very impressive for such a cheap board. If I weren't considering the 8800GTXs that come out next week, I'd probably try to locate a set of GTOs, overclock them, and run them in SLI. P-