I've downloaded the latest patch, but every time I try to set 16x anti-aliasing it crashes to desktop.
I've tried both DX9 and DX11 with the same result, which is strange, as I can use 16x in other games.
My system is fully updated with the latest ATI Radeon drivers (ATI Radeon HD 5770 card).
Just tested with the 13.3 Beta 3 ATI drivers and still no good; it also requires me to check the files for data damage.
This is the message on crash:
"Fatal Error: Direct3D driver returned error code (E_INVALIDARG) while creating a render target texture.
Technical Details: E_INVALIDARG while creating 2D render target texture size 1360 x 768 (RTEX_D24S8) ATI Radeon HD 7500 Series."
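(For context: E_INVALIDARG is what Direct3D 11 returns when you ask for a multisampled texture at a sample count the device doesn't support. Below is a minimal sketch of the kind of call that's failing, assuming RTEX_D24S8 maps to the standard D24S8 depth-stencil format; the helper name is illustrative, not the game's actual code.)

#include <d3d11.h>

// Creating a 1360x768 multisampled depth-stencil target like the one in the
// error message. If the device does not support 16 samples for this format,
// CreateTexture2D fails with E_INVALIDARG.
HRESULT CreateDepthTarget(ID3D11Device* device, ID3D11Texture2D** outTex)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width            = 1360;
    desc.Height           = 768;
    desc.MipLevels        = 1;
    desc.ArraySize        = 1;
    desc.Format           = DXGI_FORMAT_D24_UNORM_S8_UINT; // "D24S8"
    desc.SampleDesc.Count = 16;  // the 16x setting that triggers the crash
    desc.Usage            = D3D11_USAGE_DEFAULT;
    desc.BindFlags        = D3D11_BIND_DEPTH_STENCIL;
    return device->CreateTexture2D(&desc, nullptr, outTex);
}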
That's because 16x AA is meant for Nvidia cards; AMD cards don't do 16x AA. The max on AMD cards is 8x, unless you set it to edge-detect, in which case it's either 12x or 24x.
I get this exact same bug, and I'm using an MSI Twin Frozr II Radeon 7850 2 GB.
It runs fine on all other settings, but as stated above, when switching from 8x AA to 16x it crashes and spits out the exact same error code(s).
This is the ONLY game I get this error on.
My Skyrim install has 130+ mods with custom 4096x4096 textures for photo-realism and it runs smooth as butter, so I KNOW my system will handle this. The Windows API won't handle my custom install after about 4 hours, LOL... my freakin' Skyrim install is at 33 GB now. I've converted it to the AD&D realm, more or less; races look proper, no more uglies, etc.
Needless to say, I'm a HUGE AD&D fan and would really like to run this game at max settings.
I get this same error on my Radeon HD 7660G, but only when I change from Direct3D 9 to Direct3D 11. I've tried different resolutions, aspect ratios, AA levels, textures, and shadowing; it happens in Safe Mode and in the regular game. The only time a crash happens is when I try to change my AA setting from whatever it defaults to... to anything else. 4x I believe is the default, and I know for a fact I've tried 8x and 16x. (I've only gotten the crash in Safe Mode so far, but that's because I've only tried this in Safe Mode.)
Would really love a fix for this, since other than GW2 this is my main game, and going from Direct3D 9 to Direct3D 11 rendering is like going from uber-low settings to high settings... a sharp, clear picture vs. an edgy, hazy one. I think we all know the answer to that one.
I also can't use 16x; it gives me this same error every time I try to change it. I have an MSI GT70: Intel HD 4000 with an Nvidia 675MX. BTW, to anyone reading this: AVOID DISCRETE CARDS, lol. At this point I feel like every time I buy a new game I'm also buying a troubleshooting session!
I've discovered the problem here, and will have a fix. However, the end result is probably that anyone having this crash will still not have 16x multisampling, because your hardware does not support it.
The problem is a bug on our end in interpreting the return value from the Direct3D API we use to test for multisampling support on a given render target type at a particular sample count. We check the return value from the API, but we weren't checking for the case where the API succeeds yet reports a quality level count of zero, which means that despite "succeeding", the API is saying that particular level of multisampling does not work. It appears to be a DX11-specific bug, so under DX11, I think we would have reported 16x as supported on all hardware. On my current dev system, with a GTX 650, the broken code indicates 16x is supported, but the DirectX caps viewer indicates my device only supports 8x.
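In code terms, the missing check looks roughly like this (a sketch against the D3D11 API, not our actual source):

#include <d3d11.h>

// Returns true only if the device really supports sampleCount-x MSAA on this
// format. Checking the HRESULT alone is not enough: the call "succeeds" with
// *pNumQualityLevels == 0 when the sample count is unsupported.
bool IsMsaaSupported(ID3D11Device* device, DXGI_FORMAT format, UINT sampleCount)
{
    UINT qualityLevels = 0;
    HRESULT hr = device->CheckMultisampleQualityLevels(format, sampleCount,
                                                       &qualityLevels);
    return SUCCEEDED(hr) && qualityLevels > 0; // the second test was missing
}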
This code may have been partially by design (I'm not terribly confident that's the case), as certain vendor-specific features require combinations of magic numbers to enable special MSAA modes, like Nvidia's CSAA.
Anyway, if you are getting this fatal error when selecting any particular antialiasing mode, it probably means your hardware doesn't support it. We will be working on improving the UI clarity and safety here, though.
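(On the UI side, the same test can drive which modes get listed in the first place. A sketch under the same assumptions, with a hypothetical helper name:)

#include <d3d11.h>
#include <vector>

// Build the list of MSAA sample counts the options menu should offer, by
// asking the device rather than assuming. D3D11 caps sample counts at
// D3D11_MAX_MULTISAMPLE_SAMPLE_COUNT (32).
std::vector<UINT> SupportedMsaaCounts(ID3D11Device* device, DXGI_FORMAT format)
{
    std::vector<UINT> counts;
    for (UINT n = 1; n <= D3D11_MAX_MULTISAMPLE_SAMPLE_COUNT; n *= 2)
    {
        UINT quality = 0;
        if (SUCCEEDED(device->CheckMultisampleQualityLevels(format, n, &quality))
            && quality > 0)
        {
            counts.push_back(n); // e.g. 1, 2, 4, 8 on many GPUs; 16 only if reported
        }
    }
    return counts;
}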
Comments
Still a bit strange that this is the only game that's ever crashed when I've selected 16x?
Dave
Keep us posted?
I tried this and indeed, I also crash when trying to enable 16x on DX11.
ATI Radeon HD3D Sapphire 7770