sascha

WW2ONL and laptop graphics cards

6 posts in this topic

First time I tried this game on my aging laptop, so here goes:

In the initial setup dialogue (before launching the game proper), I can only select my crappy on-board GFX chip (Intel HD Graphics 4000) and nothing else.

I do have a GeForce GT 635M in this thing, which is much more powerful than the Intel chip, but it doesn't appear in the "video card" selection menu.

EDIT: Just launched the game and it's the same story with the in-game video settings: the only option that appears is the Intel chip, not the GeForce.

How do I get the game to recognize and use the GeForce?
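
For context (and this is purely my guess at what's going on, not anything from the game's actual code): older Direct3D titles usually build that "video card" list by asking Direct3D for the adapters that have a display attached. On an Optimus laptop the internal screen is wired to the Intel chip, so a list like that only shows the Intel even though the GeForce is sitting right there; the GeForce only does the rendering when the driver profile (or a per-exe override) tells it to. A minimal sketch of that kind of enumeration, assuming a plain Direct3D 9 path:

```cpp
// Sketch: list the adapters an old Direct3D 9 game would "see".
// On an Optimus laptop this typically prints only the Intel GPU, because the
// internal display is attached to it; the GeForce renders behind the scenes
// only when the driver profile (or a per-exe override) says so.
#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { std::printf("Direct3D 9 not available.\n"); return 1; }

    UINT count = d3d->GetAdapterCount();
    for (UINT i = 0; i < count; ++i) {
        D3DADAPTER_IDENTIFIER9 id = {};
        if (SUCCEEDED(d3d->GetAdapterIdentifier(i, 0, &id)))
            std::printf("Adapter %u: %s (driver %s)\n", i, id.Description, id.Driver);
    }
    d3d->Release();
    return 0;
}
```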

EDIT 2:

Sorry about the picture size.. don't have any decent Photoshop-esque program installed on this thing.

[Screenshot: ww2onl temps.bmp, a hardware-monitor readout of CPU/GPU temperatures, clock frequencies and voltages]

Not sure how to interpret the GeForce's temperature reading, but the Intel chip definitely seems to have been busy (shot taken right after exiting offline mode). Note the max value on the Intel's frequency. Also note how the GeForce's voltage hasn't peaked in any way.

Plus: You can't see it in the screenshot but "GPU-usage" for the GF is also at 0% across the board.

I'm also a bit worried about the high temperatures in general - and this happened while the laptop was sitting on a pretty efficient "cooling pad" with a huge external fan running underneath.

Oh well... how can I make sure that it's the GeForce and not the on-board graphics doing the work? Is there a way to force the game to recognize the GeForce? The thing is, it usually kicks in automatically when a game is launched.

S.

Edited by sascha


Guess I'll just answer my own post.. :D

The NVIDIA Control Panel did the trick. I set "WW2.exe" to use the GeForce graphics card (the default was the onboard Intel chip).

Gave me a nice FPS boost in offline mode (50-ish to 80-100).
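
In case anyone wonders why a per-exe entry is needed at all: the Optimus driver picks the GPU per executable, either from NVIDIA's profile database, from a user override like the one above, or from a flag the developers export from the .exe itself. A hedged sketch of that developer-side flag (these exports are documented by NVIDIA and AMD; I have no idea whether the WW2 devs actually use them):

```cpp
// Exported from the game's .exe (not a DLL): these well-known symbols ask
// switchable-graphics drivers to prefer the discrete GPU by default.
extern "C" {
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;   // NVIDIA Optimus
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;     // AMD switchable graphics
}
```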

Temperatures are still pretty scary. After 5 minutes or so, my max CPU temp was nearly 100°C and the GPU wasn't far behind. According to Intel, 105°C is the thermal design limit for the i5 in this laptop.

Was thinking of opening up the computer to check the fan for dirt and dust (after three years, there's probably a crap-load of it in there). But after having seen just how far you have to disassemble the laptop to get to the fan (in a YouTube video), I scratched that idea...

God, how I miss my desktop PC with its 2nd gen Corsair H60 water-cooler... :(

S.

Edited by sascha


Seems like the voltage going to the CPU is a little high. Try switching off turbo mode in the BIOS (if it lets you) and see what happens. It may drop your FPS a little, or it may not, but it should lower your temps, letting you play for longer than 5 minutes. You'll know after testing it.

Some laptops just get suicidally hot, unfortunately. It seems like yours is one of them.

Edited by blipoop


Yeah... err... sadly I've never been able to access the BIOS on this laptop. I don't even get the regular mainboard boot-up sequence like on a normal desktop machine, just the ASUS logo and then the Windows boot-up sequence.

I remember that I somehow managed to enter the BIOS a few years back (don't remember how) and that sort of screwed up the whole system and I had to do a re-install of the OS.

No matter... I took a can of compressed air to the heatsink and quite a bit of stuff fell out (yuk!). Then I reduced the in-game video settings to "balanced", and now I'm getting around 85°C on the CPU, along with lower temps and load on the GPU.

Sadly, the NVIDIA Control Panel doesn't recognize "playgate.exe", so I have to manually launch it with the GeForce every time I start the online launcher. But that's not too bad. What bothers me more is the absence of all my peripherals, the tiny screen, and the fact that I have to use a silly laptop keyboard to play infantry... :D

S.


Thanks pete!

There are actually two solutions that work for me:

1. In the install folder, right-click WW2.exe (offline) or playgate.exe (online) and select "Run with graphics processor" (or something like that, retranslated from my German Windows 10), then pick the NVIDIA card.

Drawback is you have to do it every time you launch.

2. WW2.exe is recognized by the NVIDIA Control Panel (although it seems to think it's a different game, lol), so you can force the NVIDIA card for this exe in the control panel.

NVIDIA Control Panel > Program Settings > find "WW2.exe" and select it > under "2. Select the preferred graphics processor for this program", pick the NVIDIA card. On my system the default was the onboard Intel chip. Click Apply.

For online mode, I had to manually add "playgate.exe" to the list.

Under 1., click "Add", browse to the installation folder and select "playgate.exe", then set it to use the NVIDIA card as described above.
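
A possible third option on newer Windows 10 builds (1803 and later, and only if the driver plays along, so no promises on an old GT 635M): Windows keeps a per-app GPU preference under HKCU\Software\Microsoft\DirectX\UserGpuPreferences, which is what Settings > Display > Graphics settings writes to. A rough sketch of setting it programmatically; the install path below is just an example, adjust it to your own:

```cpp
// Sketch: write the Windows 10 per-app GPU preference for one executable.
// "GpuPreference=2;" means "high performance" (i.e. the discrete GPU).
#include <windows.h>
#include <cstdio>
#include <cwchar>
#pragma comment(lib, "advapi32.lib")

int main() {
    // Example path only -- point this at your actual playgate.exe.
    const wchar_t* exePath = L"C:\\Games\\WWIIOnline\\playgate.exe";
    const wchar_t* pref    = L"GpuPreference=2;";

    HKEY key = nullptr;
    LSTATUS rc = RegCreateKeyExW(HKEY_CURRENT_USER,
                                 L"Software\\Microsoft\\DirectX\\UserGpuPreferences",
                                 0, nullptr, REG_OPTION_NON_VOLATILE,
                                 KEY_SET_VALUE, nullptr, &key, nullptr);
    if (rc != ERROR_SUCCESS) { std::printf("Could not open key (%ld)\n", (long)rc); return 1; }

    // Value name = full path to the exe, value data = the preference string.
    rc = RegSetValueExW(key, exePath, 0, REG_SZ,
                        reinterpret_cast<const BYTE*>(pref),
                        static_cast<DWORD>((std::wcslen(pref) + 1) * sizeof(wchar_t)));
    RegCloseKey(key);

    if (rc == ERROR_SUCCESS) std::printf("GPU preference set for playgate.exe\n");
    else                     std::printf("Could not set value (%ld)\n", (long)rc);
    return 0;
}
```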

S.

Edited by sascha

