Reporting Performance Problems

Comments

  • Archived Post Posts: 1,156,071 Arc User
    edited August 2009
    Well... it's horrible to say, but it makes me feel better that I'm not alone. My system specs aren't high end, but they're between the minimum and recommended requirements. I can play, but it's lag city, and it really doesn't matter whether I dumb my graphics down or not. It's not fun. Still, I seem to have a bit better time of it than some of the higher-end systems here!!!! The devs should no doubt hammer this out. I have to go to work, but I'll be back to maybe add my info here.

    Thanks, gang, for making me feel less... lost in lag town, all alone and broke. ;)
  • Archived Post Posts: 1,156,071 Arc User
    edited August 2009
    Hopefully they will fix the performance issues. I get the same FPS on medium graphics and low. I figured putting it on low would help, but there seems to be no difference. I was one of the blue/grey screen people that got fixed recently. I hope it's got something to do with that and they just need to optimize it a bit better.
    You already have my DxDiag, but I'm putting it up again, just 'cause.
  • Archived Post Posts: 1,156,071 Arc User
    edited August 2009
    Seive wrote:
    I found turning pre-rendered frames to 0 in the Nvidia control panel helped my framerate. Went from hovering at around 20-25 FPS to 30-35.

    Hmmm, does ATI have an equivalent option? I can't seem to find it =/
  • Archived Post Posts: 1,156,071 Arc User
    edited August 2009
    I honestly don't know jack diddly about ATI cards. :(
  • Archived Post Posts: 1,156,071 Arc User
    edited August 2009
    Blindmouse wrote:
    hmmm does ATI have an equivalent option? I can't seem to find it =/

    No.

    ..............................
  • Archived Post Posts: 1,156,071 Arc User
    edited August 2009
    I would love to log in and record the performance issues over time. But I can't even get past the character select screen.

    The champion I created a couple days ago doesn't retrieve a map list.

    The champion I made yesterday stops loading the map halfway through (going into Crisis in Canada).

    I want to create a new champion, but it won't let me click "Next" when I select a framework. It won't let me select an energy builder if I select a custom framework.

    Any way to resolve this so I can email you guys some info to help you fix it?
  • Archived Post Posts: 1,156,071 Arc User
    edited August 2009
    I did a test just for the lulz: my dual-core 3 GHz Intel E8400 runs at the same FPS with dual-core mode turned off. It's not using my second core, for certain.
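    If anyone wants to check this on their own machine while the game is running, here is a minimal sketch, assuming Python with the psutil package installed (the process name comes from the GameClient.exe mentioned later in this thread). A reading that never climbs much past 100% of one core suggests the client is effectively single-threaded:

        # Poll the game client's CPU usage; psutil reports a process using
        # multiple cores as >100% of one core, so a value pinned near 100%
        # or below is consistent with only one core being used.
        import psutil

        TARGET = "GameClient.exe"

        proc = next(p for p in psutil.process_iter(["name"])
                    if p.info["name"] == TARGET)

        cores = psutil.cpu_count(logical=True)
        for _ in range(10):
            busy = proc.cpu_percent(interval=1.0)  # blocks for 1 second
            print(f"{TARGET}: {busy:.0f}% of one core ({cores} logical cores present)")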
  • Archived Post Posts: 1,156,071 Arc User
    edited August 2009
    Hmmzz, I get 57 FPS in the Powerhouse; anywhere else it's around 20 FPS.
  • Archived Post Posts: 1,156,071 Arc User
    edited August 2009
    I am running the game at 1920x1200 with full options (except AA set to none), and I get 10-12 FPS in Millennium City center with:

    Intel Core 2 Duo E6850 3 GHz
    4 GB DDR2, Vista 64
    Two 8800 GTX cards in SLI mode.

    I upgraded to two GTX 285 cards in SLI mode, and now I get around 22-28 FPS (but still falling to 12 FPS sometimes). It's playable, but that seems really bad for such a powerful hardware configuration.

    Hope it will be fixed soon.
  • Archived Post Posts: 1,156,071 Arc User
    edited August 2009
    DaDennis wrote:
    How about SLI? Because I have a quadcore cpu (@ 3GHz) and 2 NVIDIA 285 GTXs in SLI and I get worse performance than with Age of Conan on absolutely max settings. The FPS is still between 25-50 about 95% of the time, but the stuttering the other 5% of the time is annoying nonetheless...

    I don't think the game uses even the power of ONE high-end card.
  • Archived Post Posts: 1,156,071 Arc User
    edited August 2009
    Yeah, my FPS is lucky to hit 30; even when I'm in an instance by myself it sits between 15-30.

    Intel Core 2 Duo 3.00 GHz
    4 GB RAM
    4870X2

    A two-thousand-dollar computer, and my brother on my old machine runs it just as well.
  • Archived Post Posts: 1,156,071 Arc User
    edited August 2009
    Here are my system specs:

    Athlon 64 X2 4200+
    2 GB RAM
    Powercolor HD 3870 512 MB

    I think my system meets the minimum or even the recommended system requirements.
    I run the game on recommended settings.
    There are two main issues currently.
    After some of the latest patches performance improved a little; I'm getting about 20-30 FPS now.
    But in some areas where there are many people (especially when fighting the huge Mega Destroid), performance becomes terrible.

    The second issue I'm getting is when I go into dungeons or lairs (instances), whether alone or with a group: I get severe network lag, rubberbanding issues, and "server not responding" messages. But when I go back to the main world, none of this happens.

    Hope they get this fixed in time for launch.
  • Archived Post Posts: 1,156,071 Arc User
    edited August 2009
    I am fairly certain I meet the minimum requirements, and that I just barely meet the recommended specs. I have run the game at its full-out max resolution with all the settings as high as they could go (as far as my video card supports), down to the lowest resolution with almost nothing turned on. In all tests, regardless of what I ran, I came to the same result: rubber-band movement at 5-14 FPS.

    Intel Core 2 Duo 2.6 (E6750)
    4x 1 GB sticks of OCZ DDR2 (PC2-8500)
    GeForce 8800 GT 512 MB PCI-E

    I have reformatted my hard drive and re-installed Windows XP SP3 (it needed to be done anyway), re-installed the beta client from FilePlanet, updated all my drivers to the newest releases, and ran a few diagnostics on my current setup. What I was able to discover was that the GameClient.exe process seemed to suffer from memory leak issues (steadily increasing memory usage). In game, I was basically getting something in the neighborhood of 3-12 FPS while logged into any of my toons in any of the instances. Most of my play time revolved around 5-10 PM PST.

    During my testing I made sure that I had the most recent video driver release from nVidia (version 190.62) installed on my machine. It wasn't until I rolled this driver back to release 180.48 that I actually saw my FPS jump up to 17-30 FPS during play. I continued to monitor the GameClient.exe process, which was still showing signs of memory leak issues, for approximately 1 hour. During play the "rubber-band" movement issue was still there, but only mildly so. However, the game did perform much better overall. I was going to test this further tonight by upgrading my video drivers to more recent versions until I find the one that ultimately puts me back where I started, but I wanted to throw this out there to see if it helps any of you other folks.
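    For anyone who wants to put numbers on the suspected leak rather than eyeballing Task Manager, here is a minimal sketch, assuming Python with the psutil package and the game already running; it logs the client's resident memory once a minute so you can see whether it climbs while you idle:

        # Log GameClient.exe's resident memory over time; a steady climb
        # while the game sits idle is consistent with a leak.
        import time
        import psutil

        TARGET = "GameClient.exe"

        proc = next(p for p in psutil.process_iter(["name"])
                    if p.info["name"] == TARGET)

        while proc.is_running():
            rss_mb = proc.memory_info().rss / (1024 * 1024)
            print(f"{time.strftime('%H:%M:%S')}  {rss_mb:.0f} MB")
            time.sleep(60)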
  • Archived Post Posts: 1,156,071 Arc User
    edited August 2009
    This is funny, somehow. As I said earlier, I was running the game with maximum-speed, minimum-quality settings. Today I turned everything up to maximum, expecting to get 1 or 2 FPS ^^. But I was quite impressed that the overall performance was slightly better. OK, I am still teleporting around the Millennium City map, but at 10 to 20 FPS, and it's the same with the lowest performance settings. Someone said earlier that there may be a memory leak. That seems possible, because the longer I play, the worse it gets. Regardless, this test rules out my thought that my machine could be the reason. :rolleyes:
  • Archived Post Posts: 1,156,071 Arc User
    edited August 2009
    I'm not going to put my specs up because this thread is filled with people having the same issues. I'm wondering how to get my money back at this point. It's not an easy thing to get a PC game out the door, but hitting open beta with these kinds of performance issues is pretty bad. I didn't expect an OB to be perfect, but this is unplayable. I don't know how there isn't more developer response on the forums to fix this problem. Are they even looking into it, or is this at the bottom of devtrack...
  • Archived Post Posts: 1,156,071 Arc User
    edited August 2009
    I'm not going to put my specs up because this thread is filled with people having the same issues. I'm wondering how to get my money back at this point. It's not an easy thing to get a PC game out the door, but hitting open beta with these kinds of performance issues is pretty bad. I didn't expect an OB to be perfect, but this is unplayable. I don't know how there isn't more developer response on the forums to fix this problem. Are they even looking into it, or is this at the bottom of devtrack...

    Make sure, if you haven't, to report your problem to them; the OP of this thread is a DEV, and instructions on how to report are in the first post. The more of us who report the problem, the higher (I hope) it goes in priority to get fixed.

    But I feel ya, man. What do we do? I coughed up 50 bucks to play in a beta/game that others got to play for free, and I'm sitting here hoping (crosses fingers) that they are even trying to fix this major issue that's affecting so many of us. I mean, it's not that we're whining that this or that minor imbalance in the game needs to be tweaked; we're playing the game and it is borderline unplayable, period. I just can't stand playing in the desert anymore, let alone Millennium City; the other night it was at like half a frame a second because so much action was going on (after the 40th-level present). And I have (like most of us) tried all the fixes, scaling down to half the resolution, etc., to try to get it to work, or just be playable, and still nothing. I see patches going out addressing this or that bug (zombies, etc.) and I haven't seen one post in here by a DEV since the thread was started like 8 days ago... :( (my bad, I see one on the 22nd)
  • Archived Post Posts: 1,156,071 Arc User
    edited August 2009
    My machine:

    Core i7 920 @ 3 GHz
    GeForce GTX 295 (ForceWare 190.62)
    SB Live
    2 RAID 0 HDs
    6 GB DDR3-1600 RAM
    Vista 64 SP2

    I get about 25-35 FPS with everything on at 1920x1080 with 4x MSAA. If I deactivate the shadows and lights I get about 35-60 FPS. But it still feels a little bit sluggish. Any ideas what's going wrong?
  • Archived Post Posts: 1,156,071 Arc User
    edited August 2009
    Lol, I wouldn't complain; I'd love that FPS!!!

    Here's my current setup/Rig:

    Operating System: Windows XP Home Edition (5.1, Build 2600) Service Pack 3
    System Manufacturer: Foxconn
    BIOS: Phoenix - AwardBIOS v6.00PG
    Processor: Intel(R) Pentium(R) D CPU 2.80GHz (2 CPUs)
    Memory: 2046MB RAM

    Card name: NVIDIA GeForce 8800 GT
    Manufacturer: NVIDIA
    Chip type: GeForce 8800 GT
    DAC type: Integrated RAMDAC
    Display Memory: 512.0 MB
    Current Mode: 1440 x 900 (32 bit) (60Hz)

    I'm lucky to hit 10 FPS on minimum and I have no idea why!!! My rig isn't that bad!!

    Thanks, StripySniper
  • Archived Post Posts: 1,156,071 Arc User
    edited August 2009
    Gattix wrote:
    Yeah, my FPS is lucky to hit 30; even when I'm in an instance by myself it sits between 15-30.

    Intel Core 2 Duo 3.00 GHz
    4 GB RAM
    4870X2

    A two-thousand-dollar computer, and my brother on my old machine runs it just as well.

    My system is basically the same as yours, minus one gig of RAM though. I'd get around upper 20s to upper 30s the majority of the time, and that was with settings below the halfway point, and many of the effects turned off too.

    Are the devs ever gonna throw us a bone here? I really am interested in playing, but they haven't come up with any concrete help for those who need it. :/
  • Archived Post Posts: 1,156,071 Arc User
    edited September 2009
    Like most everyone else here, my FPS went way down after the 9/10 patch. It's still that way after the 9/11 patch. Before that, things were running great.
  • Archived Post Posts: 1,156,071 Arc User
    edited September 2009
    Huskeonkel wrote:
    Like most everyone else here, my FPS went way down after the 9/10 patch. It's still that way after the 9/11 patch. Before that, things were running great.

    I had the same problem until I turned off ambient occlusion, and then my FPS shot right back up again. I also keep shadows on low.
  • Archived Post Posts: 1,156,071 Arc User
    edited September 2009
    I'm running an AMD Phenom II X4 810 processor
    2 GB RAM
    Radeon 4600-series graphics card with 1 GB of memory
    plenty of hard drive space, etc., and I still get really low FPS; however, this is ONLY when I group with other players.

    As of the last patch, FPS when solo has improved some, except when I enter crowded areas, but it's playable. In groups, like doing Burial Mound or Viper's Nest, it's almost unplayable; I can still only group with a max of two people and run with graphics bottomed out (and even that only makes a slight amount of difference). They really need to look into the lag/rubberbanding issue when in groups; it's keeping me from really enjoying the game experience, and after a while I'll get bored running solo.
  • Archived Post Posts: 1,156,071 Arc User
    edited September 2009
    FPS also went way down for me sometime this week. I had not played since 9/8; I logged in tonight and performance was noticeably different.

    My specs:

    CPU - AMD Athlon 64 X2 5000+
    RAM - 3GB DDR2
    GPU - Nvidia GeForce 8500GT - 512MB
  • Archived Post Posts: 1,156,071 Arc User
    edited September 2009
    Still no FPS change for me; it sucked for the last two weeks, and now it sucks the same.
  • Archived Post Posts: 1,156,071 Arc User
    edited September 2009
    FPS was bad but tolerable before, but since the last patch it is now pretty much unplayable. :(

    CPU - Intel Core Duo T7300 2.0 GHz
    RAM - 3 GB
    GPU - Nvidia GeForce 8600GT
  • Archived Post Posts: 1,156,071 Arc User
    edited November 2009
    Not having problems here, just noting CPU usage. The only problem I see is the extreme usage of the CPU!

    The game runs great for me.

    I'm using Vista 64-bit.
    Processor: Intel(R) Core(TM)2 Extreme CPU X9650 @ 3.00GHz

    CPU usage is 65% most of the time and often jumps to around 85%; that's crazy!
    I have a Logitech G15 keyboard, so I can see my CPU usage live on it.

    Memory (RAM): 8.00 GB DDR2; the client uses about 1 GB, not a problem.

    Graphics: NVIDIA GeForce 9800 GX2, resolution 1360x768 (the max on my TV).
    27-33 FPS most of the time with high settings.
  • Archived Post Posts: 1,156,071 Arc User
    edited December 2009
    :mad: Look... I am going to be nice in this post. There are 374 players using the GTX 295 that have this 0 FPS drop-off every few seconds... that is what I call it, because if you use Fraps (free or not) it shows FPS on screen, and you will see it go to 0 and then right back up to your normal FPS. Me, I have two GTX 295s in my computer, 12 GB of DDR3, and an i7 at 3.0 GHz. I have even tried an i7 920 at its normal clocks with one GTX 295 and 1 GB of RAM, and it's the same... What's bad is we can't get in contact with the devs of this game; no one can, and no one will listen. The support team does not care what we say; they only do what they want. This game does not support GTX 295s, no matter what, and they are not working on it and will not work on it, because they want to make sure everyone with an 8800 GT and below gets their content. Why? Because most users who play have low-end machines, and they don't think those of us with high-end machines would be playing this, only games like Modern Warfare 2 and things like that. If you read all the posts, there is nothing from one dev, not ONE, that even says anything about the GTX 295. I have paid for the lifetime subscription and have lost my money. It's sad that a graphics card 5 years older than the GTX 295 can play the game better than we can. :mad:
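    Those 0 FPS spikes are easier to demonstrate with Fraps' frametime log than with the on-screen counter. Here is a minimal sketch, assuming Python and a Fraps-style frametimes CSV (one cumulative timestamp in milliseconds per frame; the file name below is just a placeholder for whatever Fraps saved), that counts how often a single frame stalls long enough to read as 0 FPS:

        # Flag frames whose frame time exceeds a stall threshold.
        import csv

        STALL_MS = 250  # a 250 ms frame reads as roughly 0 FPS on a 1-second counter

        with open("frametimes.csv") as f:  # placeholder file name
            times = [float(row[1]) for row in csv.reader(f)
                     if row and row[0].strip().isdigit()]

        gaps = [b - a for a, b in zip(times, times[1:])]
        stalls = [(i, g) for i, g in enumerate(gaps) if g > STALL_MS]
        print(f"{len(stalls)} stalls in {times[-1] / 1000:.0f} s of capture")
        for i, g in stalls[:10]:
            print(f"  frame {i}: {g:.0f} ms")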
  • Archived Post Posts: 1,156,071 Arc User
    edited December 2009
    jehutyy wrote:
    :mad: Look... I am going to be nice in this post. There are 374 players using the GTX 295 that have this 0 FPS drop-off every few seconds... that is what I call it, because if you use Fraps (free or not) it shows FPS on screen, and you will see it go to 0 and then right back up to your normal FPS. Me, I have two GTX 295s in my computer, 12 GB of DDR3, and an i7 at 3.0 GHz. I have even tried an i7 920 at its normal clocks with one GTX 295 and 1 GB of RAM, and it's the same... What's bad is we can't get in contact with the devs of this game; no one can, and no one will listen. The support team does not care what we say; they only do what they want. This game does not support GTX 295s, no matter what, and they are not working on it and will not work on it, because they want to make sure everyone with an 8800 GT and below gets their content. Why? Because most users who play have low-end machines, and they don't think those of us with high-end machines would be playing this, only games like Modern Warfare 2 and things like that. If you read all the posts, there is nothing from one dev, not ONE, that even says anything about the GTX 295. I have paid for the lifetime subscription and have lost my money. It's sad that a graphics card 5 years older than the GTX 295 can play the game better than we can. :mad:

    /snark on

    So go buy yourself an 8800 and quit yer bellyaching. You certainly can afford it.

    /snark off
  • Archived Post Posts: 1,156,071 Arc User
    edited March 2010
    jehutyy wrote:
    :mad: Look... I am going to be nice in this post. There are 374 players using the GTX 295 that have this 0 FPS drop-off every few seconds... that is what I call it, because if you use Fraps (free or not) it shows FPS on screen, and you will see it go to 0 and then right back up to your normal FPS. Me, I have two GTX 295s in my computer, 12 GB of DDR3, and an i7 at 3.0 GHz. I have even tried an i7 920 at its normal clocks with one GTX 295 and 1 GB of RAM, and it's the same... What's bad is we can't get in contact with the devs of this game; no one can, and no one will listen. The support team does not care what we say; they only do what they want. This game does not support GTX 295s, no matter what, and they are not working on it and will not work on it, because they want to make sure everyone with an 8800 GT and below gets their content. Why? Because most users who play have low-end machines, and they don't think those of us with high-end machines would be playing this, only games like Modern Warfare 2 and things like that. If you read all the posts, there is nothing from one dev, not ONE, that even says anything about the GTX 295. I have paid for the lifetime subscription and have lost my money. It's sad that a graphics card 5 years older than the GTX 295 can play the game better than we can. :mad:

    So is it a specific problem with the GTX 295s? 'Cause I have an i7 965, 6 GB of RAM, and 2x GTX 295. It ran fine when I stopped playing right after the Halloween event. But now anything other than LOW settings gets me wildly varying frame rates and extremely choppy play. I mean, if you look at my Fraps graph it's like an oscillating curve.

    My question is simple: did the Kitchen Sink patch break this? I ask simply because I was able to play properly before, but now it is unplayable.
  • Archived Post Posts: 1,156,071 Arc User
    edited April 2010
    I get choppy framerate too; it jumps between 0 and 50 even when standing still. I've got a GeForce 8800 GTS.

    I've noticed that my CPU usage is only 20% and my card runs hot.
  • Archived Post Posts: 1,156,071 Arc User
    edited April 2010
    Attach your DxDiag file and we can see if something is obviously wrong with your system.
  • Archived Post Posts: 1,156,071 Arc User
    edited April 2010
    I've got an overheating problem with CO; it's the only game that brings my VGA over 90°C.

    Quick config:
    Q9550 @ 3.4 GHz
    GTX 295
    Asus Rampage
    4 GB DDR2 CL4
    1920x1080 resolution

    The PC is clean (I'm a maniac about it), big case (HAF 932), proper fan management, latest drivers, V-sync on.
    The VGA never even reaches 82°C with:

    Stalker COP
    Battlefield BC2
    Dragon Age
    Dawn of War 2
    Bioshock
    Age of Conan DX10

    I've tried disabling/enabling some effects without results. Anyone with the same overheating problem? :(

    P.S. I can't attach my DxDiag; the board tells me it's too big. :(
  • Archived Post Posts: 1,156,071 Arc User
    edited April 2010
    You can put your DxDiag file in a .zip file and then attach it. Or you can split it across several files and attach them all.
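    If you'd rather not do the zipping by hand, it's a one-liner in most languages; here is a minimal sketch in Python (assuming the report was already exported as dxdiag.txt in the current directory):

        # Compress a saved DxDiag report so it fits under the attachment limit.
        import zipfile

        with zipfile.ZipFile("dxdiag.zip", "w", zipfile.ZIP_DEFLATED) as zf:
            zf.write("dxdiag.txt")  # assumes the report was exported here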

    If you're overclocking considerably and have an overheating problem, then the first thing to try is restoring the stock speeds. Granted, it's your processor that is overclocked and your video card that is overheating, but you didn't say whether you've overclocked your video card as well.
  • Archived Post Posts: 1,156,071 Arc User
    edited April 2010
    VGA has stock settings :p

    Here's the DxDiag (thx 4 the tip)
  • Archived Post Posts: 1,156,071 Arc User
    edited April 2010
    I don't see anything wrong in your DxDiag file. If you've tested with other games, then the problem might be as simple as Champions Online pushing your video card harder than most other games. This game does tend to do that to Nvidia cards (but not ATI cards), for whatever reason.

    Perhaps the real moral of the story is that, apart from liquid cooling, cards with a TDP in the neighborhood of 300 W are prone to overheat even if the user does everything right. Depending on your frame rate, you may be able to reduce video card load by turning vertical sync on. You can certainly reduce it by using /maxfps. There's also an option in the game to reduce video card load. Or you could underclock the card. I'm not sure if you can disable one of the GPUs and just run it like a GTX 275, but if you can, that would be another option.
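    For example (the specific number here is just an illustration, not a recommendation), something like /maxfps 30 typed into the chat box should cap the renderer at 30 frames per second, which directly reduces how hard the card is pushed; a higher cap, or removing it, undoes the effect.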
  • Archived Post Posts: 1,156,071 Arc User
    edited April 2010
    Vsync is already on. BTW, I'm going to test until I find the culprit; I'm almost sure it's caused by some setting being too high (I pushed the settings up a little bit). Stay tuned :p
  • Archived Post Posts: 1,156,071 Arc User
    edited April 2010
    Zuxx wrote:
    Vsync is already on. BTW, I'm going to test until I find the culprit; I'm almost sure it's caused by some setting being too high (I pushed the settings up a little bit). Stay tuned :p

    You could also try forcing your fan to 100%, just in case it's not doing it automatically. One reason the game pushes Nvidia hardware so much is due to PhysX -- something that gets done in software with ATI cards.
  • Archived Post Posts: 1,156,071 Arc User
    edited April 2010
    You could also try forcing your fan to 100%, just in case it's not doing it automatically. One reason the game pushes Nvidia hardware so much is due to PhysX -- something that gets done in software with ATI cards.

    Nonsense. In any reasonably balanced setup, it gets done on the CPU, not the video card, regardless of the video card brand. If your video card is already pushed to the limit while the processor is half idle, moving computations that a processor does well off of the CPU and onto the video card is a stupid thing to do. The only reasons to run physics computations on the video card are if you have a woefully inadequate processor (e.g., a Pentium 4) paired with a very powerful video card, or if Nvidia paid the game developers a bunch of money to turn PhysX computational demands high enough that it's better described as a benchmark than a game. The latter doesn't describe Champions Online, and the former doesn't describe Zuxx's processor.

    Besides, if PhysX pushed a video card all that hard, why aren't there constant stories of dedicated PhysX cards overheating in the games that use it extensively? If a card does as much PhysX computation as 3D rendering, it doesn't mean the card works twice as hard. It means it cuts your frame rate in half, and spends half of its time doing PhysX and the other half doing 3D rendering. If PhysX alone doesn't push the card as hard as 3D rendering alone, then that actually eases the load on the video card. Indeed, this last scenario is likely, as I doubt that PhysX can make use of TMUs and ROPs, though that's just speculation on my part.
  • Archived Post Posts: 1,156,071 Arc User
    edited April 2010
    Quaternion wrote:
    Nonsense. In any reasonably balanced setup, it gets done on the CPU, not the video card, regardless of the video card brand. If your video card is already pushed to the limit while the processor is half idle, moving computations that a processor does well off of the CPU and onto the video card is a stupid thing to do. The only reasons to run physics computations on the video card are if you have a woefully inadequate processor (e.g., a Pentium 4) paired with a very powerful video card, or if Nvidia paid the game developers a bunch of money to turn PhysX computational demands high enough that it's better described as a benchmark than a game. The latter doesn't describe Champions Online, and the former doesn't describe Zuxx's processor.

    AFAIK the drivers won't switch from GPU to CPU processing on the fly. It's one or the other. And which you should choose is more about what games you're playing, not necessarily your hardware. Some games push the CPU hard, some the GPU. Making broad generalizations is not really accurate, cause not all games perform the same way.
    Besides, if PhysX pushed a video card all that hard, why aren't there constant stories of dedicated PhysX cards overheating in the games that use it extensively?

    Because PhysX only uses certain parts of the hardware, namely the shader units. The rest of the hardware (texture units, ROPS, much of the VRAM) goes virtually unused and therefore doesn't generate any significant heat.
    If a card does as much PhysX computation as 3D rendering, it doesn't mean the card works twice as hard. It means it cuts your frame rate in half, and spends half of its time doing PhysX and the other half doing 3D rendering. If PhysX alone doesn't push the card as hard as 3D rendering alone, then that actually eases the load on the video card. Indeed, this last scenario is likely, as I doubt that PhysX can make use of TMUs and ROPs, though that's just speculation on my part.

    Again, more generalizations. It depends upon the game and the PhysX implementation. Let's say, for example, that CO uses 65% of your GPU for game rendering and 25% for PhysX. If you turn off PhysX or force it into software, your GPU is going to run at 65%. Set PhysX to another card and it's still 65%. Set it to the main card and now its usage will climb to 90%. You're using one card to run two separate processes, which is definitely going to push the card harder than just one.

    CO uses a fairly minimal amount of PhysX, but it's enough to push a single card much higher than it would go otherwise. Which will cause temps on that card to rise.

    The other part of the equation is headroom. If you're using vsync or the /maxfps setting you may not realize your card is doing a lot more work with GPU PhysX instead of CPU PhysX because in both cases the card is still able to render 60 FPS. It's just working a lot harder to do it. So temps rise.
  • Archived Post Posts: 1,156,071 Arc User
    edited April 2010
    And which you should choose is more about what games you're playing, not necessarily your hardware. Some games push the CPU hard, some the GPU.

    And how many games with GPU PhysX are there that are processor bound with a reasonably balanced system and PhysX running on the GPU? Can you name one? Until any such games actually exist (which may or may not ever happen), they're not an important consideration.
    Because PhysX only uses certain parts of the hardware, namely the shader units. The rest of the hardware (texture units, ROPS, much of the VRAM) goes virtually unused and therefore doesn't generate any significant heat.

    Bingo. So why are you arguing that for a card to stop 3D rendering and focus on PhysX instead will make the card run hotter?
    Again, more generalizations. It depends upon the game and the PhysX implementation. Let's say, for example, that CO uses 65% of your GPU for game rendering and 25% for PhysX. If you turn off PhysX or force it into software, your GPU is going to run at 65%. Set PhysX to another card and it's still 65%. Set it to the main card and now its usage will climb to 90%. You're using one card to run two separate processes, which is definitely going to push the card harder than just one.

    Except that there don't exist GPUs that can do both at once. In order to do any PhysX at all, a GPU has to completely stop rendering 3D stuff, start doing some PhysX computations, finish the PhysX computations, stop doing PhysX, and return to 3D rendering. One of Nvidia's selling points of their new Thermi cards (yes, they're bad enough to earn a derisive nickname) is that it can switch back and forth between them much faster than their previous cards, so you don't get the choppy frame rates of trying to do both on a single card.

    If a card would be putting out 150 W while doing 3D rendering or 100 W for PhysX, but you change it to instead put out 150 W half of the time and 100 W the other half, that gives you an average power consumption of 125 W, which is less than 150 W.
    The other part of the equation is headroom. If you're using vsync or the /maxfps setting you may not realize your card is doing a lot more work with GPU PhysX instead of CPU PhysX because in both cases the card is still able to render 60 FPS. It's just working a lot harder to do it. So temps rise.

    Yes, if a card is idle a considerable chunk of the time, then doing PhysX on the card as well will make the card run hotter. But if your video card is idle a considerable chunk of the time and still overheating, the card has serious problems.
  • Archived Post Posts: 1,156,071 Arc User
    edited April 2010
    Quaternion wrote:
    And how many games with GPU PhysX are there that are processor bound with a reasonably balanced system and PhysX running on the GPU? Can you name one? Until any such games actually exist (which may or may not ever happen), they're not an important consideration.

    Batman Arkham Asylum will drive your CPU into the ground with software PhysX on high.

    Bingo. So why are you arguing that for a card to stop 3D rendering and focus on PhysX instead will make the card run hotter?

    Except that there don't exist GPUs that can do both at once. In order to do any PhysX at all, a GPU has to completely stop rendering 3D stuff, start doing some PhysX computations, finish the PhysX computations, stop doing PhysX, and return to 3D rendering. One of Nvidia's selling points of their new Thermi cards (yes, they're bad enough to earn a derisive nickname) is that it can switch back and forth between them much faster than their previous cards, so you don't get the choppy frame rates of trying to do both on a single card.

    But they do do both at once, just in different parts of the card. PhysX only uses the shader units. The rest of the card is off doing 3D rendering. Even if there were a complete disconnect between both modes, doing both still means the card is doing more work and will generate more heat. In normal 3D rendering the card will produce 60 FPS, though it may be capable of doing 100, 200, or more. So there is a lot of downtime that is then used for PhysX. The card is going to work harder trying to do two things at once. This is pretty basic stuff here; I don't understand why you're having trouble grasping it.
    Yes, if a card is idle a considerable chunk of the time, then doing PhysX on the card as well will make the card run hotter.

    Exactly.
    But if your video card is idle a considerable chunk of the time and still overheating, the card has serious problems.

    True, but that's an entirely different issue.
  • Archived Post Posts: 1,156,071 Arc User
    edited April 2010
    Batman Arkham Asylum will drive your CPU into the ground with software PhysX on high.

    That's why I said with PhysX done on the GPU. This doesn't look terribly processor-limited to me:

    http://www.anandtech.com/show/2841/24

    Granted, that review uses an overclocked Core i7, but if you're still GPU-bound at well over 200 frames per second, any reasonable processor can deliver a plenty fast enough frame rate and still have plenty of power left over for physics computations.

    So Nvidia paid the company a bunch of money to make the PhysX unreasonably demanding to the point that maxing PhysX is more a synthetic benchmark than a game. And also to artificially disable anti-aliasing on ATI cards, for good measure. But even in the best edge case you can find, there's plenty of processing power for a reasonable amount of physics computations. Which is precisely my point.
    But they do do both at once, just in different parts of the card. PhysX only uses the shader units. The rest of the card is off doing 3D rendering. Even if there were a complete disconnect between both modes, doing both still means the card is doing more work and will generate more heat. In normal 3D rendering the card will produce 60 FPS, though it may be capable of doing 100, 200, or more. So there is a lot of downtime that is then used for PhysX.

    Except that it can't do both at once. Why do you think video cards had shaders in the first place? It wasn't for physics computations; they're needed for 3D rendering. The game can switch back and forth every few milliseconds, and at a macroscopic level look like it's doing both at once. But for thermal purposes, that's very much doing only one thing at a time.

    Cards overheating isn't caused by the card being completely idle 20% of the time rather than 30% of the time and slightly decreasing the average power usage while not idle. Most games won't make a card overheat even if the card is idle 0% of the time. Actually, if a card is built well enough, no real games should make it overheat. But the GeForce GTX 295 is one of those weird cards that never should have existed, as the only real point is for Nvidia to be able to say it exists and point to it on benchmark lists, and not for people to actually use it.
  • Archived Post Posts: 1,156,071 Arc User
    edited April 2010
    Quaternion wrote:
    That's why I said with PhysX done on the GPU. This doesn't look terribly processor-limited to me:

    http://www.anandtech.com/show/2841/24

    Granted, that review uses an overclocked Core i7, but if you're still GPU-bound at well over 200 frames per second, any reasonable processor can deliver a plenty fast enough frame rate and still have plenty of power left over for physics computations.

    So Nvidia paid the company a bunch of money to make the PhysX unreasonably demanding to the point that maxing PhysX is more a synthetic benchmark than a game. And also to artificially disable anti-aliasing on ATI cards, for good measure. But even in the best edge case you can find, there's plenty of processing power for a reasonable amount of physics computations. Which is precisely my point.



    Except that it can't do both at once. Why do you think video cards had shaders in the first place? It wasn't for physics computations; they're needed for 3D rendering. The game can switch back and forth every few milliseconds, and at a macroscopic level look like it's doing both at once. But for thermal purposes, that's very much doing only one thing at a time.

    Cards overheating isn't caused by the card being completely idle 20% of the time rather than 30% of the time and slightly decreasing the average power usage while not idle. Most games won't make a card overheat even if the card is idle 0% of the time. Actually, if a card is built well enough, no real games should make it overheat. But the GeForce GTX 295 is one of those weird cards that never should have existed, as the only real point is for Nvidia to be able to say it exists and point to it on benchmark lists, and not for people to actually use it.

    Dude, you're just wrong. I've tried to explain it to you in the simplest way possible but you're just not listening.

    CPUs are horribly inefficient at PhysX processing, like an order of magnitude or more. Compounding the problem is that having poor PhysX performance will bring down your FPS even if the GPU can easily keep up. This is because the entire system has to wait for the PhysX data so it can be inserted into the 3D pipeline. You don't want to do CPU PhysX unless you have no other choice.

    And while the shaders can't do two things at once, the cards have more than just shaders. While the shaders are processing PhysX data, the texture and raster units are processing the last batch of 3D data from the shaders before they switched over. It's the whole nature of a massively parallel and pipelined architecture.

    You're right, though: properly cooled cards should never overheat. If they do, there is a cooling problem. Sometimes it's a problem with the rig itself (not enough fans, blocked fans) and sometimes it's a manufacturing problem (an inadequate thermal solution on the card, especially if it's factory overclocked).
  • Archived Post Posts: 1,156,071 Arc User
    edited April 2010
    CPUs are horribly inefficient at PhysX processing, like an order of magnitude or more. Compounding the problem is that having poor PhysX performance will bring down your FPS even if the GPU can easily keep up. This is because the entire system has to wait for the PhysX data so it can be inserted into the 3D pipeline. You don't want to do CPU PhysX unless you have no other choice.

    Suppose that a game is tessellated to the point of having millions of polygons. It then tries to run the game at a resolution of 7680x3200, and with 16x SSAA on, for good measure. With any video card on the market, that would likely crash entirely, or at best have an unplayably slow frame rate, more readily expressed in seconds per frame than frames per second. Does that mean that video cards aren't good at 3D rendering?

    Of course not. It only means that if you want the game to run properly, you have to keep the graphical settings reasonable. And so it is with physics computations. If you design a game with wholly unreasonable settings, then yes, you can overwhelm a processor. Nvidia has paid several game developers to at least give players the option to do that, to give them a talking point on why people should buy an Nvidia card in spite of losing badly on performance per watt, and losing badly on performance per dollar in the $150+ segment of the market.

    PhysX was around and used in many games long before any video cards could run it. It ran on a processor, and worked just fine. We have far more powerful processors now, so it can run on a processor and still work just fine if the game is designed for it.

    Yes, I'm aware that the top video cards have raw computational power of an order of magnitude or more better than the top processors, at least if the code is something that video cards can handle well. Computers have memory bandwidth about two orders of magnitude greater than hard drive bandwidth. That doesn't mean that games should avoid using a hard drive, but only that they should design around the limitations of hard drives.
    While the shaders are processing PhysX data the texture and raster units are processing the last batch of 3D data from the shaders before they switched over.

    Nvidia's architectures are fairly shader-limited to begin with. A Cypress has 1600 shaders, compared to 240 for a GT200b. Meanwhile, both have 80 TMUs and 32 ROPs. Nvidia does clock the shaders at more than double the speed of the other parts, but the shader-to-TMU processing power ratio for GT200b is about 1/3 what it is for Cypress. The same applies to the shader-to-ROP ratio. If the TMUs and ROPs are waiting on shaders even while the card is only doing 3D rendering, then mixing in a workload that uses only shaders isn't going to increase shader workload, since they're already maxed out. It will decrease TMU and ROP usage, though. GT200b has more memory bandwidth than it can make use of, so that's not the limiting factor, either.

    What you describe might make sense on a recent AMD card, which tend to have too many shaders and leave them waiting on other components. But AMD cards don't run the PhysX API.
  • Archived Post Posts: 1,156,071 Arc User
    edited April 2010
    Quaternion wrote:
    Suppose that a game is tessellated to the point of having millions of polygons. It then tries to run the game at a resolution of 7680x3200, and with 16x SSAA on, for good measure. With any video card on the market, that would likely crash entirely, or at best have an unplayably slow frame rate, more readily expressed in seconds per frame than frames per second. Does that mean that video cards aren't good at 3D rendering?

    Of course not. It only means that if you want the game to run properly, you have to keep the graphical settings reasonable. And so it is with physics computations. If you design a game with wholly unreasonable settings, then yes, you can overwhelm a processor. Nvidia has paid several game developers to at least give players the option to do that, to give them a talking point on why people should buy an Nvidia card in spite of losing badly on performance per watt, and losing badly on performance per dollar in the $150+ segment of the market.

    PhysX was around and used in many games long before any video cards could run it. It ran on a processor, and worked just fine. We have far more powerful processors now, so it can run on a processor and still work just fine if the game is designed for it.

    Yes, I'm aware that the top video cards have raw computational power of an order of magnitude or more better than the top processors, at least if the code is something that video cards can handle well. Computers have memory bandwidth about two orders of magnitude greater than hard drive bandwidth. That doesn't mean that games should avoid using a hard drive, but only that they should design around the limitations of hard drives.



    Nvidia's architectures are fairly shader-limited to begin with. A Cypress has 1600 shaders, compared to 240 for a GT200b. Meanwhile, both have 80 TMUs and 32 ROPs. Nvidia does clock the shaders at more than double the speed of the other parts, but the shader-to-TMU processing power ratio for GT200b is about 1/3 what it is for Cypress. The same applies to the shader-to-ROP ratio. If the TMUs and ROPs are waiting on shaders even while the card is only doing 3D rendering, then mixing in a workload that uses only shaders isn't going to increase shader workload, since they're already maxed out. It will decrease TMU and ROP usage, though. GT200b has more memory bandwidth than it can make use of, so that's not the limiting factor, either.

    What you describe might make sense on a recent AMD card, which tend to have too many shaders and leave them waiting on other components. But AMD cards don't run the PhysX API.

    You can't compare ATI shaders to Nvidia shaders. They are not the same. This is clear because if they were, then Cypress cards would be almost 7x faster than GT200s, and we know that's not the case. The bottom line is this (and here's where my portion of this discussion ends, 'cause I've already explained it three times now): having the same card do 3D rendering and PhysX processing is going to make the card do more work and run hotter. It's as simple as that. You might choose not to believe it, but you'd be fooling yourself.

    And I still don't get why you're advocating running PhysX on the CPU. Honestly, it's retarded. Most games won't even let you do it, and if they do, they set the PhysX level to an absolute minimum. You're certainly not doing yourself any favors.

    I feel like I'm in Bizarro forums. I'm not even sure what you're arguing with me about anymore.
  • Archived Post Posts: 1,156,071 Arc User
    edited April 2010
    And I still don't get why you're advocating running PhysX on the CPU. Honestly, it's retarded. Most games won't even let you do it, and if they do, they set the PhysX level to an absolute minimum. You're certainly not doing yourself any favors.

    All games that use PhysX allow it to run on the CPU. Most only allow it to run on the CPU, and won't let it run on a video card. For that matter, most people don't have a video card that can run PhysX on the video card, so it has to run on the CPU or else the game won't run at all.

    But it's simple, really. Run Champions Online with PhysX running on the CPU. Open Task Manager and go to the Performance tab. For me, it shows around 20%-30% CPU usage. So even if I could, why would I want to take work away from the processor and give it to the video card, which already has all it can handle?
  • Archived Post Posts: 1,156,071 Arc User
    edited April 2010
    Quaternion wrote:
    All games that use PhysX allow it to run on the CPU. Most only allow it to run on the CPU, and won't let it run on a video card. For that matter, most people don't have a video card that can run PhysX on the video card, so it has to run on the CPU or else the game won't run at all.

    But it's simple, really. Run Champions Online with PhysX running on the CPU. Open Task Manager and go to the Performance tab. For me, it shows around 20%-30% CPU usage. So even if I could, why would I want to take work away from the processor and give it to the video card, which already has all it can handle?

    Whatever makes you happy, dude. You're in dreamland. But if it works for you, more power to ya.
  • Archived Post Posts: 1,156,071 Arc User
    edited November 2010
    Well, after taking about a year off from CO, I thought I would give it another go to see if the performance issues had been settled.

    This time I ran the game on two different machines: my desktop and an Alienware M15x laptop.

    Nothing has changed. The performance is still garbage.

    The only conclusion I can come to is that CO is just junk software - it's really that simple.
  • Archived Post Posts: 1,156,071 Arc User
    edited November 2010
    What are your performance issues? And attach the DxDiag file for both machines so we can see if you've got hardware that should run it well.
  • Archived Post Posts: 1,156,071 Arc User
    edited November 2010
    Performance is fine here, on par with two other MMOs.