
S9: Graphics max settings now making my GPU/CPU struggle


Comments

  • edgecrysger Member Posts: 2,740 Arc User
    edited May 2014
    Bad news. For those who can't run the game at more than 20-25 fps, there is no solution. It seems that Cryptic is not aware of this issue; more likely they think it's not their problem :mad:. Hell, as I noticed, I don't think they even know about these posts (imagine my surprise and my hallucination). So I guess some of you will need to just forget about Star Trek Online. I don't think they are even aware of the bad performance after Season 9, not even of the Undine battlezone and the terrible lag there, which makes me think that Cryptic definitely wants to kill this game for good. Too bad, I really had hope to see STO improving and not the opposite. But I really don't find any explanation for Cryptic's behaviour. This is like 1 case in a million lol. Sigh.
  • ussprometheus79 Member Posts: 727 Arc User
    edited May 2014
    For those who may not have read the maintenance notes thread for Holodeck... it appears they are in fact aware.

    http://sto-forum.perfectworld.com/showpost.php?p=17078321&postcount=74
  • otowi Member Posts: 600 Arc User
    edited May 2014
    th3xr34p3r wrote: »
    If draw distance is at fault then I have a solution: take advantage of DX11.2 with tessellation LOD and texture streaming based on what is on screen and what is not, rather than the old method of keeping EVERYTHING, even off-screen, at peak texture resolution and LOD.

    This is another problem I see in the gaming industry: why stick to old band-aid methods when the tech is already available for them to use? Same with the OS. Most if not all gamers have moved on since the release of the 5xx series of GPUs, especially with the 20nm 8xx series around the bend. They need to pull their fingers outta their asses, stop with DX9/DX10 versions, and focus on DX11/12 versions due to the better performance, full stop.

    Ummm...

    DX 11/12 is not an option for many players. I tried, and my graphics card nearly committed suicide.

    Yes, DX11/12 might be better, but when a lot of the player base has older cards unable to even run DX11/12 and you focus only on DX11/12, you have a huge problem on your hands.

    That would mean those who cannot use DX11/12 would need to go buy new graphics cards that support the new DX versions...

    So no, not a good solution by any means, when the problem is with the game's rendering engine.

    When I get a CTD with an error message stating I have run out of video memory, there is something very wrong. I can run Battlefield 3 all day long at medium settings with no such error message, same with StarCraft II at almost max settings.

    That proves to me that there is a big problem with the way the rendering engine uses memory, and it might also indicate a memory leak in the rendering engine...
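
    (To make that concrete: here's a toy sketch, not Cryptic's actual engine, with made-up numbers, of how a renderer that leaks per-frame allocations eventually blows a fixed VRAM budget and produces exactly that "out of video memory" crash.)

```cpp
#include <cstddef>
#include <cstdio>

// Toy model: a renderer with a fixed VRAM budget. If transient
// per-frame allocations are never freed (a leak), resident memory
// only ever grows until the budget is exhausted -- the
// "run out of video memory" CTD described above.
int main() {
    const std::size_t budgetMB = 1024;       // pretend 1 GB card
    std::size_t residentMB = 300;            // base textures and meshes
    const std::size_t leakedPerFrameMB = 2;  // buffers never released

    for (int frame = 1; ; ++frame) {
        residentMB += leakedPerFrameMB;      // the leak
        if (residentMB > budgetMB) {
            std::printf("frame %d: out of video memory (%zu MB > %zu MB)\n",
                        frame, residentMB, budgetMB);
            break;
        }
    }
    return 0;
}
```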
  • edgecrysger Member Posts: 2,740 Arc User
    edited May 2014
    ussprometheus79 wrote: »
    For those who may not have read the maintenance notes thread for Holodeck... it appears they are in fact aware.

    http://sto-forum.perfectworld.com/showpost.php?p=17078321&postcount=74

    LOL, that announcement was posted after the same guy said they didn't know anything about those fps issues.. :cool: and another guy pointed him to a link with the post in question (ok, I think I am not going to put my hand in the fire..).

    And no, DX9 in the case of STO is always better. Why? Because, for a start, with DX11 you don't get any graphical improvement, so there is no point in using it. Period. Second, because STO works far worse in DX11, so the lag will be even worse using DX11.

    No DX12.. lol, just DX9 or DX11; those are the options.
  • aureleus Member Posts: 175 Arc User
    edited May 2014
    edgecrysger wrote: »
    LOL, that announcement was posted after the same guy said they didn't know anything about those fps issues.. :cool: and another guy pointed him to a link with the post in question (ok, I think I am not going to put my hand in the fire..).

    And no, DX9 in the case of STO is always better. Why? Because, for a start, with DX11 you don't get any graphical improvement, so there is no point in using it. Period. Second, because STO works far worse in DX11, so the lag will be even worse using DX11.

    No DX12.. lol, just DX9 or DX11; those are the options.

    I have tested STO with both DX9 and DX11; performance-wise they're literally the same in my case. And yes, STO does use a certain amount of tessellation, for those of you questioning it. I know this because details on my ships popped out a lot more when I flipped to DX11 after I got my 270X; before that I was using an HD4890. Sure, it's not on the same level as most newer titles, but do you really expect Cryptic to suddenly pop out a giant patch that does nothing but add more DX11-related content C_C? Can you imagine what that would do to STO atm....lolz

    So no, there's not a ton of DX11 in STO atm, just some. They need to update the engine more before we see anything groundbreaking in STO >.>.

    Also, DX11 would be a good place to put more resources into. Why? Better support for shaders, textures, lighting, etc. You name it, and it's almost definite that whatever DX11 has is better than what DX9 has. Sure, keep DX9 as a legacy format for those who just don't have the hardware, but I wouldn't rule out the use of either DX9 or DX11. DX10, however... no, we can't have that, sorry... just... no... >.>.

    Something else that most people are not addressing is the fact that STO is still a game based around a 32-bit engine, which means it is very limited, especially when you start packing new content into it. If it were a single-player game that wouldn't really be an issue, but since it's an MMO it becomes a problem later down the road.

    Well Cryptic, we're at the end of that road, and I think it's time STO got a well-deserved overhaul... Seriously.
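
    (For anyone wondering why the 32-bit point matters, the arithmetic is unforgiving. A quick sketch; the 2 GiB figure is the default user-space limit for a 32-bit Windows process, not anything STO-specific.)

```cpp
#include <cstdint>
#include <cstdio>

// A 32-bit process can address at most 2^32 bytes = 4 GiB, and a
// default 32-bit Windows process gets only 2 GiB of user space.
// Every texture, mesh, and heap allocation must fit inside that
// window, which is why packing new content into a 32-bit engine
// becomes a problem down the road.
int main() {
    const std::uint64_t addressable = std::uint64_t(1) << 32;  // 4 GiB
    const std::uint64_t defaultUserSpace = addressable / 2;    // 2 GiB
    std::printf("32-bit addressable: %llu MiB\n",
                (unsigned long long)(addressable >> 20));
    std::printf("default user space: %llu MiB\n",
                (unsigned long long)(defaultUserSpace >> 20));
    return 0;
}
```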
  • ussprometheus79 Member Posts: 727 Arc User
    edited May 2014
    edgecrysger wrote: »
    Second, because STO works far worse in DX11, so the lag will be even worse using DX11.

    Not the case; I noticed no performance difference whatsoever in either DX9 or DX11.

    It'd be interesting if we hear back from the devs, through Smirk or otherwise.
  • th3xr34p3r Member Posts: 0 Arc User
    edited May 2014
    otowi wrote: »
    Ummm...

    DX 11/12 is not an option for many players. I tried, and my graphics card nearly committed suicide.

    Yes, DX11/12 might be better, but when a lot of the player base has older cards unable to even run DX11/12 and you focus only on DX11/12, you have a huge problem on your hands.

    That would mean those who cannot use DX11/12 would need to go buy new graphics cards that support the new DX versions...

    So no, not a good solution by any means, when the problem is with the game's rendering engine.

    When I get a CTD with an error message stating I have run out of video memory, there is something very wrong. I can run Battlefield 3 all day long at medium settings with no such error message, same with StarCraft II at almost max settings.

    That proves to me that there is a big problem with the way the rendering engine uses memory, and it might also indicate a memory leak in the rendering engine...

    If you have anything lower than a Radeon HD7790 2GB or Nvidia GTX650Ti Boost 2GB by Q1 2015, you're gonna have a bad time with new games in the future, because by the time those games release, cards that can only use DX9/10 will be totally obsolete.

    As for VRAM issues, this is exactly what DX11.2 addresses with tiled resources.

    A 64-bit rendering engine with a proper DX11/12 implementation will run better than a 32-bit rendering engine with DX9/10 and a half-assed DX11 implementation.

    If you're not willing to upgrade your PC's GPU every 3.5-4 years to get the most out of the games coming out on a set budget, then why game on PC at all?
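
    (Rough idea of what tiled resources buy you -- this is only the residency concept, not the actual D3D11.2 API, and the tile counts are made up: a huge texture is split into fixed-size tiles, and only the tiles visible this frame are kept in VRAM.)

```cpp
#include <cstddef>
#include <cstdio>
#include <unordered_set>

// Sketch of the tiled-resources idea: a large virtual texture is
// split into 64 KB tiles, and only the tiles the camera can actually
// see this frame stay resident in VRAM.
int main() {
    const std::size_t tileKB = 64;
    const std::size_t totalTiles = 16384;  // a 1 GB virtual texture

    // Pretend a visibility pass found these tiles on screen.
    const std::unordered_set<std::size_t> visibleTiles = {0, 1, 2, 40, 41, 512};

    const std::size_t residentKB = visibleTiles.size() * tileKB;
    std::printf("full texture: %zu MB, resident this frame: %zu KB\n",
                totalTiles * tileKB / 1024, residentKB);
    return 0;
}
```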
  • edgecrysger Member Posts: 2,740 Arc User
    edited May 2014
    No, for god's sake, STO doesn't use tessellation at all, dude.. that is an advanced rendering method for new-generation games, for things like hair, grass, and really tiny objects... and it's post-STO, not prior. :D

    And in my case, I noticed a slight increase in performance using DX9, as it should be, since STO is not optimized at all; thinking straight, it should work worse in DX11.
  • edgecrysger Member Posts: 2,740 Arc User
    edited May 2014
    th3xr34p3r wrote: »
    If you have anything lower than a Radeon HD7790 2GB or Nvidia GTX650Ti Boost 2GB by Q1 2015, you're gonna have a bad time with new games in the future, because by the time those games release, cards that can only use DX9/10 will be totally obsolete.

    Any video card 4-5 years old has support for DX11. It should. And no, for next-generation games you will need A LOT more than a GTX 650 xDD, let's say ten times better. We are talking about minimum requirements of, for example, 8 GB video RAM, 64 GB system RAM, eight-core processors, and things like that. Right now, with a GTX 650 Ti (the one I have) you can play anything at 1080p with all settings maxed, with the exception of things like 16x AA and tessellation at high levels. But right now, a GTX 650 can run everything. Mine does. But by the year 2015, things are going to be, well.. different xD. :D
  • th3xr34p3r Member Posts: 0 Arc User
    edited May 2014
    edgecrysger wrote: »
    Any video card 4-5 years old has support for DX11. It should. And no, for next-generation games you will need A LOT more than a GTX 650 xDD, let's say ten times better. We are talking about minimum requirements of, for example, 8 GB video RAM, 64 GB system RAM, eight-core processors, and things like that. Right now, with a GTX 650 Ti (the one I have) you can play anything at 1080p with all settings maxed, with the exception of things like 16x AA and tessellation at high levels. But right now, a GTX 650 can run everything. Mine does. But by the year 2015, things are going to be, well.. different xD. :D


    *facepalm* Why do people FAIL to read what I stated.. I said LOWER THAN. As for VRAM, we already have 4-6GB cards if you want to pay extra, and that's mainly for after you max out your in-game settings and want to play at 4K or greater, with or without surround.

    The 20nm 8xx's are going to be overall worse than a 780 due to the die issue, hence why you want to wait till the 16/14nm die-cut cards are released, which imo will not happen any sooner than Q1 2016.

    As for system RAM, you do realize how cheap that is now, right? And no, unless you are doing heavy RAM-intensive tasks (like game streaming, video rendering, etc.), you only want at least 16GB for gaming. Same for the CPU: you don't need anything more than a quad core for general gaming.

    As for AA: the higher your screen resolution, the less AA you need, due to the higher pixel density.
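
    (The pixel-density point as a quick back-of-envelope calc; the monitor sizes here are just examples.)

```cpp
#include <cmath>
#include <cstdio>

// Pixel density (PPI) = sqrt(w^2 + h^2) / diagonal in inches.
// Smaller pixels make jagged edges less visible, so a higher-density
// screen can get away with a lower AA level.
static double ppi(int w, int h, double diagonalInches) {
    return std::sqrt(double(w) * w + double(h) * h) / diagonalInches;
}

int main() {
    std::printf("24\" 1080p: %.0f PPI\n", ppi(1920, 1080, 24.0));  // ~92
    std::printf("24\" 4K:    %.0f PPI\n", ppi(3840, 2160, 24.0));  // ~184
    return 0;
}
```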
  • aureleus Member Posts: 175 Arc User
    edited May 2014
    edgecrysger wrote: »
    No, for god's sake, STO doesn't use tessellation at all, dude.. that is an advanced rendering method for new-generation games, for things like hair, grass, and really tiny objects... and it's post-STO, not prior. :D

    And in my case, I noticed a slight increase in performance using DX9, as it should be, since STO is not optimized at all; thinking straight, it should work worse in DX11.

    The reason your GTX 650 lags a bit with DX11 is that it's on the same performance level as, say, a GTX 460 (based on core configs), so you're going to notice a performance drop if your GPU can't handle the added load :/. Try toning your shadows down to medium; it helps a lot lolz.

    Tessellation is used primarily for advanced land masses and objects (models/meshes, etc.); other DX11 features also include advanced shadowing, lighting, and shaders.

    Take for example how rope looks in some games: it usually doesn't have any depth to it up close. Bring DX11 into the picture and it completely changes; instead of the rope being 2D, it's now true 3D and has the detail needed to give it a very realistic appearance. No more one-sided textures and models (in the rope's case).

    DX11 focuses heavily on using a lot more vertices than previous DX versions, and because of this there is much more detail in objects. This does not necessarily mean less performance; that only comes into play with a low-end GPU that just doesn't have the raw power for it.
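
    (To put rough numbers on the "more vertices" point -- a toy calc, not how the DX11 hull/domain stages are literally configured: each uniform subdivision level splits every triangle into four.)

```cpp
#include <cstdio>

// Each uniform subdivision step splits every triangle into 4, so the
// triangle count grows 4x per tessellation level. The GPU generates
// these vertices on the fly, which is why a low-end card chokes at
// high tessellation levels.
int main() {
    long long triangles = 1000;  // base mesh
    for (int level = 0; level <= 4; ++level) {
        std::printf("level %d: %lld triangles\n", level, triangles);
        triangles *= 4;
    }
    return 0;
}
```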
  • edgecrysger Member Posts: 2,740 Arc User
    edited May 2014
    aureleus wrote: »
    The reason your GTX 650 lags a bit with DX11 is that it's on the same performance level as, say, a GTX 460 (based on core configs), so you're going to notice a performance drop if your GPU can't handle the added load :/. Try toning your shadows down to medium; it helps a lot lolz.

    Tessellation is used primarily for advanced land masses and objects (models/meshes, etc.); other DX11 features also include advanced shadowing, lighting, and shaders.

    Take for example how rope looks in some games: it usually doesn't have any depth to it up close. Bring DX11 into the picture and it completely changes; instead of the rope being 2D, it's now true 3D and has the detail needed to give it a very realistic appearance. No more one-sided textures and models (in the rope's case).

    DX11 focuses heavily on using a lot more vertices than previous DX versions, and because of this there is much more detail in objects. This does not necessarily mean less performance; that only comes into play with a low-end GPU that just doesn't have the raw power for it.

    No, my GTX 650 doesn't lag, I never said that?? I said that if I activate vsync, I suffer lag spikes. But I don't have vsync activated; I play at 60-70 fps all the time. Using DX11, the only difference was minor spikes dropping to 45-50 fps, but that's all. Playing at 70 fps all the time, I am able to notice drops of 15 fps. I use DX11 in other games, regular games, and my video card performs amazingly well.

    And yes, tessellation is used for that as well, and a lot of things, but primarily for rendering complex small objects like hair and grass. That was the primary goal when it was developed. Btw, "Tomb Raider" comes to mind, a clear example of the use of tessellation effects. And again, no, tessellation is not used in STO. DX11 has no improvements whatsoever over DX9 in this case. I played with DX11 until a couple of months ago, and nothing, nada, no difference at all. Well, I'm wrong: the only difference I noticed was the inclusion of 2 more antialiasing modes in the options. That's all.

    What STO needs is a revamped engine, period. :P
  • th3xr34p3r Member Posts: 0 Arc User
    edited May 2014
    edgecrysger wrote: »
    No, my GTX 650 doesn't lag, I never said that?? I said that if I activate vsync, I suffer lag spikes. But I don't have vsync activated; I play at 60-70 fps all the time. Using DX11, the only difference was minor spikes dropping to 45-50 fps, but that's all. Playing at 70 fps all the time, I am able to notice drops of 15 fps. I use DX11 in other games, regular games, and my video card performs amazingly well.

    And yes, tessellation is used for that as well, and a lot of things, but primarily for rendering complex small objects like hair and grass. That was the primary goal when it was developed. Btw, "Tomb Raider" comes to mind, a clear example of the use of tessellation effects. And again, no, tessellation is not used in STO. DX11 has no improvements whatsoever over DX9 in this case. I played with DX11 until a couple of months ago, and nothing, nada, no difference at all. Well, I'm wrong: the only difference I noticed was the inclusion of 2 more antialiasing modes in the options. That's all.

    What STO needs is a revamped engine, period. :P

    Finally, something we all agree on and have been saying pretty much the entire time.
  • th3xr34p3r Member Posts: 0 Arc User
    edited May 2014
    Well, that was interesting. I just updated to driver set 337.81 Beta, and even with combat dropping down into the 24-ish range it felt smoother vs 337.50 Beta in Vicious Cycle.

    I did not notice I had dropped into the 20s till I looked at my G13 display.
  • nyxadrill Member Posts: 1,242 Arc User
    edited May 2014
    A quick question, guys,

    (I'm too lazy to google :P )

    What exactly does VSync do? I've seen the option but never thought to play with it.
  • th3xr34p3r Member Posts: 0 Arc User
    edited May 2014
    nyxadrill wrote: »
    A quick question, guys,

    (I'm too lazy to google :P )

    What exactly does VSync do? I've seen the option but never thought to play with it.

    It locks the GPU to your display's max refresh rate; once it's on, any extra frames are not rendered.

    On Nvidia there are two options: the old always-on vsync, and the newer adaptive vsync, which turns vsync off when you are not hitting your max refresh rate and turns it on when you do, to help mitigate screen tearing.

    The next gen of displays, once more widely available, will support a technology called G-Sync, which turns vsync on its head by having the display sync up to your GPU instead, to further eliminate frame hitching and screen tearing.
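
    (A little sketch of the difference, assuming a 60 Hz display and made-up frame times: standard vsync makes a late frame wait for the next vblank, while adaptive vsync lets it through and tears briefly instead of halving the frame rate.)

```cpp
#include <cmath>
#include <cstdio>

// Frame pacing at 60 Hz (16.67 ms per refresh). With standard vsync a
// late frame waits for the *next* vblank (33.3 ms -> a 30 fps stutter);
// adaptive vsync skips the wait when you miss the deadline.
int main() {
    const double refreshMs = 1000.0 / 60.0;
    const double renderMs[] = {12.0, 15.0, 18.0, 25.0};  // sample frames

    for (double r : renderMs) {
        const double vsyncMs = std::ceil(r / refreshMs) * refreshMs;
        const double adaptiveMs = (r <= refreshMs) ? refreshMs : r;
        std::printf("render %.0f ms -> vsync %.1f ms, adaptive %.1f ms\n",
                    r, vsyncMs, adaptiveMs);
    }
    return 0;
}
```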
  • nyxadrill Member Posts: 1,242 Arc User
    edited May 2014
    th3xr34p3r wrote: »
    It locks the GPU to your display's max refresh rate; once it's on, any extra frames are not rendered.

    On Nvidia there are two options: the old always-on vsync, and the newer adaptive vsync, which turns vsync off when you are not hitting your max refresh rate and turns it on when you do, to help mitigate screen tearing.

    The next gen of displays, once more widely available, will support a technology called G-Sync, which turns vsync on its head by having the display sync up to your GPU instead, to further eliminate frame hitching and screen tearing.

    Ah! Thank you :)
  • aureleus Member Posts: 175 Arc User
    edited May 2014
    th3xr34p3r wrote: »
    It locks the GPU to your display's max refresh rate; once it's on, any extra frames are not rendered.

    On Nvidia there are two options: the old always-on vsync, and the newer adaptive vsync, which turns vsync off when you are not hitting your max refresh rate and turns it on when you do, to help mitigate screen tearing.

    The next gen of displays, once more widely available, will support a technology called G-Sync, which turns vsync on its head by having the display sync up to your GPU instead, to further eliminate frame hitching and screen tearing.

    The only issue with G-Sync is that it's proprietary hardware for Nvidia GPUs only, meaning if you want it you're forced to use one of their newer video cards to make it work :/. Personally, I am getting tired of all this proprietary bs from them lolz.
  • captainoblivous Member Posts: 2,284 Arc User
    edited May 2014
    aureleus wrote: »
    The only issue with G-Sync is that it's proprietary hardware for Nvidia GPUs only, meaning if you want it you're forced to use one of their newer video cards to make it work :/. Personally, I am getting tired of all this proprietary bs from them lolz.

    Not to mention that, the last I heard, the monitor needs to have certain hardware installed as well.
  • ussprometheus79 Member Posts: 727 Arc User
    edited May 2014
    captainoblivous wrote: »
    Not to mention that, the last I heard, the monitor needs to have certain hardware installed as well.

    I wasn't too sure on that point myself.
  • captainoblivous Member Posts: 2,284 Arc User
    edited May 2014
    ussprometheus79 wrote: »
    I wasn't too sure on that point myself.

    I was right. Here it is. It needs a piece of hardware fitted to the monitor, but to my knowledge only one monitor can mount such hardware, and even then it needs to be retrofitted either by the user or the retailer.
  • th3xr34p3r Member Posts: 0 Arc User
    edited May 2014
    captainoblivous wrote: »
    I was right. Here it is. It needs a piece of hardware fitted to the monitor, but to my knowledge only one monitor can mount such hardware, and even then it needs to be retrofitted either by the user or the retailer.


    It's still too early to recommend it as a must-get item for a PC gamer, but from my point of view it's valid enough to at least keep an eye on when it comes to new tech.
  • edgecrysger Member Posts: 2,740 Arc User
    edited May 2014
    captainoblivous wrote: »
    I was right. Here it is. It needs a piece of hardware fitted to the monitor, but to my knowledge only one monitor can mount such hardware, and even then it needs to be retrofitted either by the user or the retailer.

    Bleh, it doesn't matter. In the end, they will start making computer screens with that tech included, because it is tech, and that's the evolution. Unless it's a failure, something we don't know yet.
  • aureleus Member Posts: 175 Arc User
    edited May 2014
    edgecrysger wrote: »
    Bleh, it doesn't matter. In the end, they will start making computer screens with that tech included, because it is tech, and that's the evolution. Unless it's a failure, something we don't know yet.

    They only make certain monitors with that hardware; if companies were to put it on all their monitors as a proprietary piece of hardware, they'd actually lose business. People do not like locked-in proprietary hardware just to get one benefit out of it. So instead of paying, say, $300 for the video card and using its built-in vsync, they're trying to force you to pay that $300 plus whatever the G-Sync-enabled monitor costs, so say $600 total. Not many people are going to like that just to get G-Sync.

    With that said, I'd be better off just keeping my current monitor and getting a super beefed-up video card at that price lolz.
  • ussprometheus79 Member Posts: 727 Arc User
    edited May 2014
    th3xr34p3r wrote: »
    Finally, something we all agree on and have been saying pretty much the entire time.

    A new engine would indeed be nice. Maybe I could do with a new one as well. My 2.0 D4D just ain't cutting it anymore. :D
  • ddemlong Member Posts: 294
    edited May 2014
    Is it just me, or is everyone's screen also just randomly flashing since the patch?
  • ussprometheus79 Member Posts: 727 Arc User
    edited May 2014
    ddemlong wrote: »
    Is it just me, or is everyone's screen also just randomly flashing since the patch?

    Not noticed. Does it flash blank, or does it still leave stuff on the screen, like some of the UI elements for example?
  • philosophere Member Posts: 607 Arc User
    edited May 2014
    Nothing in the latest Tribble patch notes, and still no further word since Captain Smirk said he would look into it.

    Getting a bit miffed.... anything from them would be appreciated... :mad:
  • mouerte Member Posts: 0 Arc User
    edited June 2014
    Hey all.
    So I'm using a GTX 670 4GB, and it's been running like TRIBBLE since S9.
    Anyhow, I got the opportunity to buy one more for more or less cheap to run SLI, so I figured, hell, why not. Got it installed and it's working fine.
    But did it make a difference in STO? Hell no, lol. Still the same bad FPS drops and lag, sub-30 fps in STFs.
    The only place it makes any difference is the load screen, where it was 450 fps and now went to 670-690. Wow, thanks, big help. DURP :P
  • captainoblivous Member Posts: 2,284 Arc User
    edited June 2014
    mouerte wrote: »
    Hey all.
    So I'm using a GTX 670 4GB, and it's been running like TRIBBLE since S9.
    Anyhow, I got the opportunity to buy one more for more or less cheap to run SLI, so I figured, hell, why not. Got it installed and it's working fine.
    But did it make a difference in STO? Hell no, lol. Still the same bad FPS drops and lag, sub-30 fps in STFs.
    The only place it makes any difference is the load screen, where it was 450 fps and now went to 670-690. Wow, thanks, big help. DURP :P

    That's because STO is far heavier on CPU usage than on GPU usage or VRAM.

    Derp indeed.
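
    (Roughly why, as a sketch with made-up numbers: frame time is bounded by the slower of the CPU and GPU work per frame, and SLI only divides the GPU share.)

```cpp
#include <algorithm>
#include <cstdio>

// If the CPU takes longer per frame than the GPU, adding GPUs changes
// nothing: frame time ~ max(cpuMs, gpuMs / numGpus).
int main() {
    const double cpuMs = 33.0;  // simulation + draw-call submission
    const double gpuMs = 10.0;  // actual rendering work

    for (int gpus = 1; gpus <= 2; ++gpus) {
        const double frameMs = std::max(cpuMs, gpuMs / gpus);
        std::printf("%d GPU(s): ~%.0f ms/frame (~%.0f fps)\n",
                    gpus, frameMs, 1000.0 / frameMs);
    }
    return 0;
}
```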