I run a BFG GeForce 285 SC 2 Gig. With the Nvidia controls set to Application Preferences, and with the game at 75% non-advanced video settings, I get around 20-30 FPS depending on where I am and what I am doing.
It changes to 50-60+ once I turn off Post Processing. I suspect that each frame naturally spends longer in the GPU as these "post effects" (i.e. the comic outlines) are applied. If you need a graphics boost, I would suggest turning that feature off. I can also hold 50+ FPS with texture and lighting at max and 4x AA quite easily at 1920x1080.
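To put rough numbers on that (back-of-envelope math from my own FPS readings above, not anything measured inside the engine):

```python
# Rough frame-time arithmetic. The FPS figures are the ones I quoted above;
# the per-frame cost of the post effects falls out of the difference.
def fps_to_ms(fps):
    """Convert frames per second to milliseconds per frame."""
    return 1000.0 / fps

with_post = fps_to_ms(25)     # ~40.0 ms/frame (my 20-30 FPS midpoint)
without_post = fps_to_ms(55)  # ~18.2 ms/frame (my 50-60 FPS midpoint)

# The gap is roughly what the post effects (comic outlines etc.) cost the
# GPU on every single frame.
print(f"post-processing cost: ~{with_post - without_post:.1f} ms/frame")
```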
I know this question is subject to taste, but I am sorry to say NO, the game does not look good to me with the post-processing effects turned off.
I have the comic-outline thingy turned off as well, and even though I am able to play the game, it looks like crap.
With the post effects turned off I don't see any detail on my superpowers and they all look like the same blurry cloud.
I don't know if this is a good sacrifice to get the FPS I want.
In addition to the above, I got an FPS boost by disabling the Framerate Stabilizer options under the Troubleshooting Options. (I'm running two 7950 GTs in SLI mode and forcing Split Frame Rendering, and with all settings maxed I got an 8 FPS boost in large open areas of MC and Project Greenskin with lots of people about.) I have an Intel Dual Core running @ 2.4GHz and 2 GB RAM on WinXP SP3, so I'm not that far behind in that area of hardware. I'm also running on some older nVidia drivers (the ones before they began including the PhysX addon software, as neither of those cards has that hardware), and every newer driver since the ones I'm currently still using caused a framerate DECREASE in the games I play. I'm actually pretty amazed at the performance I've gotten from these older cards in Champions Online.
I don't know if this tweak would create an improvement for higher end cards or non-SLI modes, but it's just another tweak to try. If others are trying SLI configs, I used the City of Heroes profile as a base, since at its heart it's still the same GFX engine.
You'd lose the comic visual style by turning off post processing. The game starts to look like some other hero game when you turn that off. AA smooths out lines by averaging the pixels around them so edges don't have that ladder effect where you can start seeing pixels, but are smoothed out instead. Turning off AA does increase performance, as it saves the GPU time rendering out the frame. Turning off post processing saves the GPU time too, for the same reason. I guess it comes down to what you're happy with personally.
I was playing UT3 the other day, trying to figure out what that game is doing differently from CO, and why it had such a good frame rate when looking at them side by side. In UT3 you're in a room for the most part, whereas in CO you can view a whole city, for one, and this post processing was added on top. Cutting that off, I started to see similar performance between the two games.
I run a BFG GeForce 285 SC 2 Gig. With the Nvidia controls set to Application Preferences, and with the game at 75% non-advanced video settings, I get around 20-30 FPS depending on where I am and what I am doing.
That is so messed up... you should be able to run this game maxed out on everything at 60 FPS with that kind of card. They need to fix this, bad...
People have been saying the game is putting more strain on the CPU than it should and the GPU is doing almost nothing.
Fact, not fiction... the game is not optimized at all to press your vid cards, and instead is smashing your CPU more than it should. Give them some time, cause this isn't an issue they can just hotfix, but it should be looked at soon, I would hope.
I wish people would stop posting topics like this. You have a monster card, and you think that it's ok to turn settings down in this game and still not get 60 fps?
I run a Core 2 Quad at 2.5 GHz and 4 gigs of DDR3, along with an EVGA GeForce 8800 GTS that I have in there for physics. I have to say that this system runs most games wonderfully. CO is a late 2009 release and I honestly don't expect to see these high numbers till the GeForce 300/400 series cards come out. Exactly what is causing the problems, I can't quite say. As a CPU test, I can run Supreme Commander quite well without much slowdown at all. Adding effects for the GPU to render out, like AA and post processing, just increases its time with each frame by a lot, which stacks up delays and keeps frames from coming out as frequently. I am not quite sure what else they can really do to increase performance besides making sure that items aren't rendered out twice in various stages on the GPU. UT3 does a lot of rendering before you enter the game, which takes a load away from the GPU by not having to calculate those things in real time. I am not familiar with how CO's engine functions, so I can't provide much on that.
One thing though: if what I think here is the case, it seems it might take a while before CO's graphics get dated; if not, then yeah, it might be an optimization issue.
I wish people would stop posting topics like this. You have a monster card, and you think that it's ok to turn settings down in this game and still not get 60 fps?
sigh...
What I'd want to see is how the 360 version will run, as they know what the hardware is. One problem with designing for PCs is the amount of hardware variation. Just look at the amount of effort that goes into supporting Nvidia and ATI cards and all the issues that come up with that; it's not easy. Would I like to run around at 60 FPS on high? Yes. Do I think it's quite possible? I know it's not with my current setup. Is it my own issue or an engine issue? Only the devs would know.
One thing that would be interesting to know is what type of performance they could get off their best machine at Cryptic, and what its specs are...
Sadly, CO is CPU bound, meaning your high end GPU is sitting around idle a lot of the time. Over time Cryptic will work out the kinks and you'll get to use all of your GPU, but until then think of CO as a way to let your GPU relax and enjoy watching your CPU sweat it out.
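A crude way to picture "CPU bound" (a toy model with invented millisecond numbers, assuming CPU and GPU work on frames in a simple overlapped pipeline; CO's real timings are unknown to me):

```python
# Toy model of a pipelined frame: the slower stage sets the frame rate.
def fps(cpu_ms, gpu_ms):
    """With CPU and GPU work overlapped, frame time is the slower of the two."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 33.0  # hypothetical: 33 ms of simulation/draw-call work per frame

for gpu_ms in (25.0, 12.0, 6.0):  # progressively faster graphics cards
    print(f"GPU at {gpu_ms:4.1f} ms -> {fps(cpu_ms, gpu_ms):.0f} FPS")

# Prints 30 FPS every time: once the GPU outruns the CPU, a faster card
# buys nothing -- it just idles while the CPU sweats it out.
```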
Sadly, CO is CPU bound, meaning your high end GPU is sitting around idle a lot of the time. Over time Cryptic will work out the kinks and you'll get to use all of your GPU, but until then think of CO as a way to let your GPU relax and enjoy watching your CPU sweat it out.
One has a 260 gtx
One has a 9800 gtx
One has an ATI Mobility Radeon HD 4650.
Each of the comps has the graphics settings about the same; I did some tweaking in the advanced settings. Turning off the AA helped a little, maybe something like 5-10 FPS; they ran in the 20-30s for FPS.
But turning off the post processing gave me about 20-30 FPS more, raising them to around 40-60 FPS. If I turn the Cartoon effect off the game does look really dull, but with it turned on it still looks good.
...AA smooths out lines by averaging the pixels around them so edges don't have that ladder effect where you can start seeing pixels, but are smoothed out instead.
Not to nitpick or anything, but if AA were to take averages from pixels around it, you would in fact only be adding a blur. AA works by taking sub-samples from within the pixel.
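To illustrate the difference (a toy Python sketch with a made-up one-edge "scene", not how any real driver implements it):

```python
# Shade a pixel crossed by an edge: everything left of x = 2.5 is white.
def scene(x, y):
    """Ideal image: 1.0 left of the edge at x = 2.5, 0.0 right of it."""
    return 1.0 if x < 2.5 else 0.0

def supersample(px, py, n=4):
    """Real AA: average n*n sub-samples taken INSIDE pixel (px, py)."""
    total = sum(scene(px + (i + 0.5) / n, py + (j + 0.5) / n)
                for i in range(n) for j in range(n))
    return total / (n * n)

def neighbor_blur(px, py):
    """The "average the pixels around it" idea: blends whole neighbors."""
    centers = [(px + dx + 0.5, py + dy + 0.5)
               for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
    return sum(scene(x, y) for x, y in centers) / len(centers)

# Pixel (2, 2) straddles the edge; exactly half of it is white.
print(supersample(2, 2))    # 0.5  -- sub-samples recover the true coverage
print(neighbor_blur(2, 2))  # ~0.33 -- dragged around by pixels a whole
                            # pixel away: that's blur, not anti-aliasing
```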
Not to nitpick or anything, but if AA were to take averages from pixels around it, you would in fact only be adding a blur. AA works by taking sub-samples from within the pixel.
Yeah - basically. AA is a "let's use a lot of memory to make things prettier" solution. The proper solution is to play at a ridiculous resolution. When we have monitors that measure in the 19000x12000 range, we won't need AA. Ahh how I look forward to that.
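For scale (crude buffer math at 32-bit colour, ignoring depth buffers and the compression tricks real cards use, so ballpark only):

```python
# Color buffer size in MB for a given resolution and AA sample count.
def megabytes(width, height, samples=1, bytes_per_px=4):
    return width * height * samples * bytes_per_px / (1024 ** 2)

print(f"{megabytes(1920, 1080):.1f} MB")             # ~7.9 MB: plain 1080p
print(f"{megabytes(1920, 1080, samples=4):.1f} MB")  # ~31.6 MB: 1080p at 4x AA
print(f"{megabytes(19000, 12000):.1f} MB")           # ~869.8 MB: the
                                                     # "ridiculous resolution"
                                                     # future, no AA needed
```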
Yeah - basically. AA is a "let's use a lot of memory to make things prettier" solution. The proper solution is to play at a ridiculous resolution. When we have monitors that measure in the 19000x12000 range, we won't need AA. Ahh how I look forward to that.
Considering there's been absolutely zero improvement in monitor DPI/dot pitch since I started PC gaming 15-odd years ago, I'm going to guess that we're in for a long, long wait before we see the need for AA vanish. The only devices that seem to sport much higher DPI are laptops and other mobile devices. Software will end up having to be written to be independent of the DPI... running 1920x1200 on my 15.4" laptop can be downright painful at times, as most applications either don't scale at all or scale very poorly.
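Just to put numbers on that (quick math; the 24-inch desktop panel is an assumed comparison point, not one mentioned above):

```python
import math

# Pixels per inch from resolution and diagonal size.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(f'15.4" laptop: {ppi(1920, 1200, 15.4):.0f} PPI')  # ~147 PPI
print(f'24" desktop:  {ppi(1920, 1200, 24.0):.0f} PPI')  # ~94 PPI
```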
Well, considering my comp runs like crud: in Canada the ground frequently disappears whilst swinging, and then I get the problem for a minute where your character pops in and out of the floor. I have everything set to give performance, no shadows, etc.
When this comes to Xbox, if it plays well and you can use your character on Xbox, I will probably move to that instead.
My comp though (don't s****** plz ):
AMD Athlon 64 (3200+)
ATI AGP HD2600 Pro
240GB hard drive
That's what it says on the case (the HD 2600 Pro is added in).
Well, apparently even if you have an awesome CPU and GPU you cannot run the game at 1920x1200 res. I have an i7 965 and 2 x 295s; it should be able to run it just fine, but I do not get above 35-40 FPS, when I should be getting at least 60 FPS. Well, the devs did say that they were working on it, so hopefully they will come out with a solution soon.
(9800GX2, 2.4 Quad, 4gig.)
That is so messed up... you should be able to run this game maxed out on everything at 60 FPS with that kind of card. They need to fix this, bad...
Not if one has a crappy CPU.
One thing though: if what I think here is the case, it seems it might take a while before CO's graphics get dated; if not, then yeah, it might be an optimization issue.
Yeah, that wouldn't be good at all then.
A 3GHz quad core is not a crappy CPU.
Would a 3GHz quad core count as a crappy CPU?
Cuz I have a GTX 285 and the CPU above, and I get a max of around 35 FPS on "Recommended" settings with shadows off.