
Neverwinter crashing and flickering display. Can this game understand Nvidia CUDA?

mwkmwk · Posts: 85 · Member, Arc User
edited September 2018 in Peer to Peer Tech Forum
Just started Neverwinter, but for some reason I keep crashing at the, um, Protector's Enclave Vault. I'm level 4.

I can play Neverwinter at max graphics, it seems, but I sometimes get random screen flicker in some areas. The game runs smoothly at max without any problems, and I can run around going weeeeeeeeeeeeee. Still, there seems to be a problem with the game utilizing my card: the display flickers in some areas no matter what settings I use. FPS doesn't seem to be the problem for me.

+ ASUS N61JV-X4

+ Nvidia GeForce GT 325M, 1 GB, CUDA, with Optimus Technology (switches from Intel graphics) [white light: Nvidia] [blue light: Intel]
I know it's using Nvidia when I play Neverwinter.

+ Intel Core i5-450M, 2.4 GHz, with Turbo Boost Technology up to just about 2.53/2.6 GHz

+ 8 GB of DDR3 RAM, which is enough to play this game.

Neverwinter is not utilizing some functions of Nvidia cards, it seems! Nvidia cards constantly seem to have problems in this game compared to others. That said, I've seen on the internet that one option seems to help in some cases with the texture problems: go into Game Options > Advanced and disable multi-core rendering. So I might try that; however, CUDA is a multi-core function.

I tried turning off anti-aliasing and the flicker went away. I don't understand why disabling anti-aliasing solves the flickering, since it seems I'm able to use it at times. I'm positive that C++ code needs to be implemented for it to run properly. I researched my card in detail; CUDA was the first thing I searched, so perhaps implementing the C++ code from this site would help eliminate some of the flickering?

The site actually tells how to enter the texture coding, which can be found here:
https://en.wikipedia.org/wiki/CUDA

This C++ code would need to be implemented for Neverwinter to understand CUDA and run properly with Nvidia graphics cards.

From just researching, it seems this game is advertised for AMD/ATI Radeon, and the game needs improvement for Nvidia. Actually, it's right in front of your face on this site!

Also, I never understood why some games have to be allowed through a firewall. Might that explain some of the crashing?

Also, this site is bogus! I used to play TERA Online at max settings with no problems, and it stated I couldn't play it:

https://www.game-debate.com/games/index.php?g_id=1199&canMyGpuRunIt=Neverwinter

I should be running better than I am, and my computer blows out hot air like a dragon. I'm just stating that something needs to be implemented for Neverwinter to work with Nvidia CUDA cores.

Also, no offense, but whose idea was it for Neverwinter to patch files in certain areas? That could be a problem, and perhaps a culprit in some of the crashing!

I'm just trying to solve problems for everyone, including myself. Mostly myself, because it's happening to me, LOL.

Comments

  • mwkmwk · Posts: 85 · Member, Arc User
    edited September 2018
    If you are experiencing game crashes after changing your in-game graphics settings and cannot change them back, or have recently patched the game and are noticing graphical issues when playing which cannot be fixed by changing the in-game settings, please try the following:
    1. Locate the GamePrefs.pref file in the game installation folder. For Neverwinter, check the \Neverwinter_en\localdata folder.
    2. Delete the GamePrefs.pref file.
    3. Restart the game.
    My comment: what happened, and what was done, to create this issue once a patch came into effect?

    https://support.arcgames.com/app/answers/detail/a_id/4261/kw/neverwinter%20freezing

    PLEASE FIX IT
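The deletion steps quoted above can be sketched as a small script. This is only an illustration: the function name and the install-folder argument are my own assumptions, and the actual install path varies per machine.

```python
import os

def reset_graphics_prefs(install_dir):
    """Delete GamePrefs.pref so the game rebuilds default graphics settings.

    install_dir is assumed to be the folder containing Neverwinter_en;
    the subfolder layout follows the support steps quoted above.
    """
    pref_path = os.path.join(install_dir, "Neverwinter_en",
                             "localdata", "GamePrefs.pref")
    if os.path.isfile(pref_path):
        os.remove(pref_path)  # step 2: delete the preferences file
        return True           # the game regenerates defaults on restart (step 3)
    return False              # nothing to delete
```

After running this (or deleting the file by hand), restart the game so it regenerates default graphics settings.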




  • mwkmwk · Posts: 85 · Member, Arc User
    edited September 2018
    Without meaning to sound like a HAMSTER, you do know that the performance issues are CPU-based rather than GPU-based? The difference from, say, a GTX 1060 to a 1080 is tiny compared to CPU performance differences. E.g., a slight OC on the CPU yielded me about 20% more FPS. Stuttering can be caused by other issues, though.

    And if you wanted actual GPU optimization, you'd have to overhaul the entire thing, as it would be a shorter process in Python. The performance gains are minimal, though. And no, it's not "optimized" for either GPU brand.

    If they wanted a proper API they'd work on DX12 or Vulkan.

    And besides, you think they can just copy-paste that code somewhere, maybe change a value or two, and that's it?

    That's a big no-no. Even if the original designer used a remotely similar way of rendering things, they'd have to modify the great majority of the game to work with that specific code.



    (Also, you just now realized that gamedebate is a piece of HAMSTER? Wow.)
  • mwkmwk · Posts: 85 · Member, Arc User
    edited September 2018
    Well, as of now it does seem there is room for optimization to work with Nvidia CUDA graphics cards. My card can handle somewhat higher settings; I exaggerated by saying max settings, though. My card will display extreme settings, just not around many people, and probably not with many animations in this game. In TERA Online I could do complete max, so something more. What I'm saying is that I want more visual quality than the game is currently allowing me, without crashing. For now I have decent enough quality to at least play the game without crashing later on; it doesn't look that bad.

    Testing higher settings, I did get a Direct3D error from the crash dialog after fighting enemies for a while, which means the game isn't accepting some settings I should be capable of using. With those settings it played alright, then froze and crashed. It seems ridiculous that I can't use any shadows or other settings without crashing. I played Skyforge and could turn settings up a bit more, and that game is quite similar graphically, so there is room for improvement. At least I can have water visuals at max without crashing, lol, so I have the ability to do more.
  • mwkmwk · Posts: 85 · Member, Arc User
    edited September 2018
    Not really any lag at higher settings, but more visuals the game won't accept, and I crash after a while of gameplay. That's what I meant to clarify about what's happening. Skyforge allowed me to use 4x-8x anti-aliasing; I can't even use it in this game without crashing, and without flickering in some areas.
  • mwkmwk · Posts: 85 · Member, Arc User
    edited September 2018
    Just wanted to note that I can raise the visuals. I'm using 2x anti-aliasing and 2x anisotropic filtering with low shadows for now. I could go higher; however, shadows are what make the render quality, and they're on low for now. As tested, there is hardly a noticeable difference in image crispness between 2x and 4x anti-aliasing or 2x and 4x anisotropic filtering. I could turn it up more, but setting shadows to high is what makes the game crash, at least for me, after a while.

    Changing the shadows is what determines the render quality in this game. I have it just high enough for enjoyment, but at a setting that keeps performance. I will test some higher settings later, but for now it's fine. FPS-wise, it's smooth and enjoyable.

    So if anyone has major crashing problems, shadows are the setting you have to lower.