
No SLI usage, single GPU only?

jnohdjnohd Posts: 5 Arc User
edited September 2014 in PC & Technical Issues
Noticed this for the first time today, after a monitor change:

[Image: Br_Q0UnCUAEgvJg.jpg]

Precision X is reporting no usage of GPU 2 for CO. I'm also on the 340.43 beta driver, but I can't roll back without other games on this machine having issues, unless I wind up swapping drivers every time I switch games.

STO and NW seem fine under 340.43.

The only other changes made since the last time I played besides the driver is the monitor (LG 34UM95 @3440x1440).

Anyone else running 340.43 and SLI? Is the game using both GPUs?

Is there an in-game command I can try to force multi-GPU usage? (There's nothing in the UI.)

Or a particular setting I should change in the Nvidia control panel? (It's currently set to the default of Global: NVIDIA recommended, with CO's Program Settings set to Use Global.)
Wampaq@Jnoh, Fleet Leader: ..Bloodbath and Beyond | 'Iw HaH je Hoch!
ALL HOLDINGS FINISHED! - Starbase 5-5-5-5 || Embassy 3-3-3 || Mine 3-3-3 || Spire 3-3-3
A laid back KDF fleet welcoming independent, casual, & part-time players and groups. Roms & alts welcome.
Send in-game mail to Wampaq@Jnoh, visit our recruitment thread and FB page for more info.

Comments

  • jnohdjnohd Posts: 5 Arc User
    edited July 2014
    New Nvidia Driver today (340.52).

    CO still uses only one GPU, and heats only one GPU, NOT both, despite the "Nvidia Recommended" SLI setting for both Global and CO's gameclient.exe profile.

    Again, this does not happen in STO or NW; both continue to use dual GPUs in SLI. Other games also load evenly, so this is not an issue with the GPUs or the bridge. The problem lies either in a change to the Nvidia Recommended setting for CO, or in CO itself.
  • championshewolfchampionshewolf Posts: 4,375 Arc User
    edited July 2014
    Champions hasn't been set up for SLI yet as far as I know. DX11 is still in its beta state, so the engine hasn't been upgraded for it, basically.
    Champions Online player since September of 2008, forumite since February of 2008.
    Silverspar on PRIMUS
    Get the Forums Enhancement Extension!
  • quotablequotable Posts: 29 Arc User
    edited September 2014
    SLI, like CrossFire, is handled by video driver magic. It's not something that game designers can implement directly. While it is possible for a game developer to go out of his way to break SLI and CrossFire, it's unlikely that Cryptic would have done so accidentally.

    If you're using a frame rate limiter of any sort (whether vertical sync or in the troubleshooting options to keep power consumption down), that can mess with multi-GPU setups, though. If the primary GPU is done rendering a frame before the next frame is allowed to start, having the primary GPU render the next frame is the sensible thing to do. If this happens every frame, only one GPU does anything. I'm not sure if drivers will try to make that happen, but it would be the smart thing to do.
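    As a toy illustration (my own model, not anything from Nvidia's documentation), here's that frame-cap scenario in code: if one GPU can finish a frame before the limiter releases the next one, there's never a second frame in flight, and the second GPU has nothing to do.

    ```python
    import math

    # Toy model of alternate-frame rendering (AFR) under a frame cap.
    # Assumption: a frame only goes to the second GPU when a new frame is
    # allowed to start while the first GPU is still busy.
    def gpus_used(render_ms, cap_fps, num_gpus=2):
        """How many GPUs an idealized AFR setup actually keeps busy."""
        frame_interval_ms = 1000.0 / cap_fps
        frames_in_flight = render_ms / frame_interval_ms
        return min(num_gpus, max(1, math.ceil(frames_in_flight)))

    print(gpus_used(render_ms=6.0, cap_fps=60))   # fast frames, 60 fps cap -> 1
    print(gpus_used(render_ms=25.0, cap_fps=60))  # slow frames, 60 fps cap -> 2
    ```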

    Both AMD and Nvidia put their focus on making multi-GPU cards work with newer games and newer video cards, especially of the sort likely to show up in benchmarks in hardware reviews, so you're very beholden to your GPU vendor for driver updates.

    Furthermore, since multi-GPU setups add latency, you need much higher frame rates than a single GPU would offer in order to justify using it. If an SLI setup offers you 10% higher frame rates than you'd get from just a single card, you'll get a better gaming experience by disabling SLI and just using the single card. The goal is a better gameplay experience, not merely having two cards putting out a bunch of heat instead of one.
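    To put rough numbers on that trade-off (my own back-of-envelope model, assuming AFR queues roughly one extra frame), a 10% frame-rate gain can still mean worse input-to-display latency than a single card:

    ```python
    # Hedged sketch: latency estimated as frame time times (1 + frames queued).
    def frame_latency_ms(fps, extra_frames_queued=0):
        frame_ms = 1000.0 / fps
        return frame_ms * (1 + extra_frames_queued)

    single = frame_latency_ms(60)                      # single GPU at 60 fps
    sli = frame_latency_ms(66, extra_frames_queued=1)  # +10% fps, one extra queued frame

    print(f"single GPU: {single:.1f} ms")   # 16.7 ms
    print(f"SLI, +10% fps: {sli:.1f} ms")   # 30.3 ms
    ```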

    You say that SLI works with STO and Neverwinter; I'd recommend disabling it, running on a single card, and seeing how that changes your frame rate. If it doesn't cause a huge drop in frame rates, then I'd contend that SLI isn't doing anything worthwhile in those other games, either.

    What cards do you have, anyway? From your video memory amount, my guess is either GTX 470s or GTX 570s. If that's the case, running SLI on older cards in an older game is asking for driver trouble.

    Quaternion from the previous forum
  • misfitfiend138misfitfiend138 Posts: 3 Arc User
    edited September 2014
    I'm running a Crossfire rig and it's been working fine for me with no hassles at all:

    [Image: 15yec7.jpg]

    Are you sure you're not running in windowed mode? Both CrossFire and SLI need to run in full-screen mode to work; any form of windowed mode will cancel out one of the GPUs.
  • jnohdjnohd Posts: 5 Arc User
    edited September 2014
    [Photo: 20140920_094028_zpsxgnnqaoj.jpg]

    Full screen, v-sync off, no frame limit. Still no second gpu usage.

    Cards are both EVGA 780ti Classifieds, driver is up to date (new GeForce experience and driver the last few days, actually.) Driving an LG 34um95 at 3440x1440. Is purdy.

    Not even going to get into the suggestion that I be "content" with a bug that limits my hardware usage. Save that for people deciding if they want to upgrade to a second card.
  • jnohdjnohd Posts: 5 Arc User
    edited September 2014
    championshewolf wrote: »
    Champions hasn't been set up for SLi yet as far as I know. Dx11 is still in its beta state, so the engine hasn't been upgraded for it basically.

    Ah, this is a useful reply! You're saying CO isn't set up for SLI under DX11, specifically?
  • quotablequotable Posts: 29 Arc User
    edited September 2014
    I'm not entirely sure what the picture above is. It looks more like a photograph than a screenshot.

    I'm guessing that the numbers are GPU temperature, GPU load, GPU fan speed, video memory used, average frame rate, and latest frame time. But that's just a guess.

    As I said above, SLI, like CrossFire, is implemented purely through video driver magic. Game developers implementing it directly has nothing to do with it. You don't write shaders or implement API calls that say "now turn SLI on". Only the video drivers can do that. Blaming Cryptic for SLI not being used is pointing fingers in the wrong direction, unless you expect them to write their own video drivers for every GPU to ever exist.


    I'd suggest that you back up and ask what is the goal. Is the goal to make your video cards run as hot as possible and heat up your room as much as possible? Or is the goal to get high frame rates for smooth gameplay? If you're getting 140 frames per second on a single GPU, I'd question why you want the second GPU to kick in.

    As I said above, if you're not meaningfully GPU bound, there's no benefit to having a second GPU get used. All that bouncing between GPUs can possibly do in that situation is to slow down your frame rate, add display latency, add power draw and heat output, and add weird driver glitches. None of those are good things.

    If you want to experiment to see whether that is the issue, you could try changing your settings to make sure that you're GPU bound. Conveniently, Champions Online has a feature to add arbitrary amounts of GPU load without touching the CPU load. It's called renderscale. It's issued from the chat box, and the syntax is:

    /renderscale 2

    Or some other number instead of 2. A number of 1 is normal, 2 is equivalent to 4x SSAA, 3 is equivalent to 9x SSAA, and so forth. Non-integer values are also allowed, and values below 1 make it clearer what it is doing. Turn that high enough that your frame rate drops markedly and you're definitely GPU bound. See what happens to the GPU load in that case.
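    To see why this piles load onto the GPU so quickly, here's the arithmetic (the N-squared relationship is from the SSAA equivalence described above; the function name is my own illustration):

    ```python
    # Pixels shaded per frame at a given renderscale: scaling each axis
    # by N multiplies the shaded pixel count by roughly N*N.
    def shaded_pixels(width, height, renderscale=1.0):
        return int(width * renderscale) * int(height * renderscale)

    base = shaded_pixels(3440, 1440)          # native 3440x1440
    doubled = shaded_pixels(3440, 1440, 2.0)  # /renderscale 2

    print(doubled / base)  # 4.0 -- four times the pixel work, all on the GPU
    ```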

    Quaternion from the previous forum
  • chalupaoffurychalupaoffury Posts: 2,553 Arc User
    edited September 2014
    quotable wrote: »
    SLI, like CrossFire, is handled by video driver magic. It's not something that game designers can implement directly. While it is possible for a game developer to go out of his way to break SLI and CrossFire, it's unlikely that Cryptic would have done so accidentally.

    I find your lack of faith disturbing.
    In game, I am @EvilTaco. Happily killing purple gang members since May 2008.
    RIP Caine