
2 year old nvidia card beats new dx 11 ati card

System Member, NoReporting Posts: 178,019 Arc User

Comments

  • Archived Post Member Posts: 2,264,498 Arc User
    edited February 2010
    I just wrote a thread about graphics, lmao. The GeForce 9 series is **** poor, omg!! lmao
  • Archived Post Member Posts: 2,264,498 Arc User
    edited February 2010
    Yea, STO does not like ATI. I can only hope a new patch will fix that
  • Archived Post Member Posts: 2,264,498 Arc User
    edited February 2010
    odingrey wrote: »
    Yea, STO does not like ATI. I can only hope a new patch will fix that

    No doubt ATI will tweak their drivers to get better results, they always do... unless it's a CrossFire problem (which this isn't), in which case they just ignore it, like they have with Oblivion and Fallout 3.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited February 2010
    matthmaroo wrote:

    Sad little fanboi, nobody cares that an old chip, massively overclocked and overpriced, gets a high score.... YAWN.

    Problem is, what score will the nvidia card get in Colin McRae's Dirt 2 on the PC? 0 FPS, because Nvidia does not have any DX11 part ready. They've only just mastered DX10.1, by forcing an overclocked GPU to do a lot of software calculations because the hardware can't do it.

    What's next, are you going to try to convince us that the PS3 is still faster than a PC?
  • Archived Post Member Posts: 2,264,498 Arc User
    edited February 2010
    Woo, glad I have the GTX 260, though I have the extreme overclocked version from Point of View :D
  • Archived Post Member Posts: 2,264,498 Arc User
    edited February 2010
    Show me an Nvidia card that can use this benchmark to its fullest: http://unigine.com/download/
    Nvidia just got pwned.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited February 2010
    Wow, results like that are usually indicative of driver level problems with a game.

    I run STO at the recommended preset with AA at 4x and AF at 8x at 1280x1024 on my 2x 5750 setup, and I never drop below 40 fps even on one card.

    Really makes me wonder if those guys were...

    A. Using the 10.2 Catalysts that came out the other day.

    B. Able to conquer the 3D clock fiasco where ATi cards still aren't clocking properly in 3D (even in 10.2). I fixed it by making custom overclocking profiles with correct voltage settings (0.990 V for low clocks, 1.110 V for high clocks) in ATi Tray Tools, saving those profiles as '2D', '3D', and 'LP' (Low Power), and binding them to hotkey combos for easy use (Ctrl+F10, F11, and F12). I can now switch between 137/300, 400/900, and 700/1150 and have the card stay clocked at what it's supposed to be, without driver-restart errors or clocks reverting to different settings mid-game.
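    The hotkeyed profile scheme described above can be sketched as a small table. This is a hypothetical illustration only: ATi Tray Tools is a Windows GUI tool, nothing here touches real hardware, and the pairing of profile names with the three quoted clock pairs is my reading of the post.

```python
# Hypothetical sketch of the '2D'/'3D'/'LP' profile scheme described above.
# Profile-name-to-clock-pair mapping is an assumption, not from the tool.
# Each profile pairs core/memory clocks (MHz) with a fixed GPU voltage (V).
PROFILES = {
    "LP": {"core": 137, "mem": 300,  "vgpu": 0.990},  # low-power idle clocks
    "2D": {"core": 400, "mem": 900,  "vgpu": 0.990},  # desktop use
    "3D": {"core": 700, "mem": 1150, "vgpu": 1.110},  # full 3D clocks
}

def apply_profile(name: str) -> str:
    """Pretend to apply a profile; return a summary of what would be set."""
    p = PROFILES[name]
    return f"{name}: core {p['core']} MHz, mem {p['mem']} MHz, {p['vgpu']:.3f} V"

# e.g. the Ctrl+F12 hotkey in the post would correspond to apply_profile("3D")
```

    The point of pinning a fixed voltage to each profile is that the clocks no longer revert mid-game when the driver mis-detects the load state.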

    This benchmark seems to point to the Tom's Hardware guys using the Catalysts as they are, and not doctoring them up the way I have learned to over the past month. That is exactly why ATi is in a tight spot right now. They need to improve their drivers and fix core functions. Not every person has it in them to discover a solution on their own, and right now nVidia drivers work properly without the use of 'user fixes'.

    However, if nVidia hadn't had such bad yields on Fermi and been forced to delay it, they would probably be dealing with driver problems as well. Every new generation of cards has its share of bugs. Hell, my brand-new 6800GT back in 2004 boasted a PureVideo feature that never worked properly and was never fixed.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited February 2010
    Yeah, it's a damn shame: not only do ATI cards run poorly in STO, but dynamic lighting still hasn't been fixed so it doesn't cause crashes. :(
  • Archived Post Member Posts: 2,264,498 Arc User
    edited February 2010
    In fairness, that article was written before the 10.2 drivers were out. But then the 196.21 drivers were also not out.

    I tried STO on a 5670 with the 10.2 drivers and, after turning it down some, was able to play fine.

    Who the heck plays on a 9600 GT? Then again, I can't pick on that one since I tried it on a 5670, but it seems like any half-serious gamer is going to be using something in the 8800/9800/250 or better range.

    Correct me if I'm wrong, but isn't the GTX 285 a better comparison to the 5850? The 260 is about $100 cheaper than the 5850, and a 285 is maybe $30-$40 more than a 5850.

    If you're going to have a "shoot-out" on a title, at least compare apples to apples.

    I'm not bashing ATI; on cost vs. performance they are hard to beat, but I do prefer Nvidia.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited February 2010
    I have a 5770 and my framerates are better than that. I have everything maxed out except for dynamic lighting of course.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited February 2010
    ELITE-Kaos wrote: »
    Sad little fanboi, nobody cares that an old chip, massively overclocked and overpriced, gets a high score.... YAWN.

    Problem is, what score will the nvidia card get in Colin McRae's Dirt 2 on the PC? 0 FPS, because Nvidia does not have any DX11 part ready. They've only just mastered DX10.1, by forcing an overclocked GPU to do a lot of software calculations because the hardware can't do it.

    What's next, are you going to try to convince us that the PS3 is still faster than a PC?

    Most developers are working on DX10 games at the moment, and many still use DX9 as their primary platform. By grabbing a high-end ATI DX11 card so early, you may have just guaranteed it will be obsolete quicker. By the time DX11 games become mainstream, you will probably want a faster card, ATI or Nvidia.

    I'll hold off of any further discussion on the topic until at least March 26th.

    http://www.geek.com/articles/games/nvidia-directx-11-cards-launch-march-26th-20100223/
  • Archived Post Member Posts: 2,264,498 Arc User
    edited March 2010
    CapnScragg wrote: »
    Most developers are working on DX10 games at the moment, and many still use DX9 as their primary platform. By grabbing a high-end ATI DX11 card so early, you may have just guaranteed it will be obsolete quicker. By the time DX11 games become mainstream, you will probably want a faster card, ATI or Nvidia.

    I'll hold off of any further discussion on the topic until at least March 26th.

    http://www.geek.com/articles/games/nvidia-directx-11-cards-launch-march-26th-20100223/

    Some of us have to learn how to program with DX11; that's why some of us buy the cards, or have them purchased for us ;)
  • Archived Post Member Posts: 2,264,498 Arc User
    edited March 2010
    I'm running a 5850 with all settings maxed and 8x AA, and it's smooth as butter. The CPU also comes into play somewhat here: I'm running an AMD Phenom II X4 965, but it's OCed to 4.2 GHz.

    The video card is all stock, no OCing.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited March 2010
    ATI has its pluses, but heat has always been excessive in the older ATI cards. Their newer cards are supposed to run cooler, but I am not interested at this time in running water blocks to keep things from cooking. NVIDIA has its share of problems too. Both companies talk like they're hurting for cash. And we are still waiting for the 300 series from NVIDIA that was never released; sometime in April we are supposed to see this latest card.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited March 2010
    Starlanced wrote:
    I'm running a 5850 with all settings maxed and 8x AA, and it's smooth as butter. The CPU also comes into play somewhat here: I'm running an AMD Phenom II X4 965, but it's OCed to 4.2 GHz.

    The video card is all stock, no OCing.

    Actually, as long as you have a dual core, the CPU makes no difference at all.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited March 2010
    That's why I got the HD4670. Maybe it's not top of the line, but I run above max settings at 30fps. You just have to stay away from budget chipsets.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited March 2010
    eklipze wrote:
    That's why I got the HD4670. Maybe it's not top of the line, but I run above max settings at 30fps. You just have to stay away from budget chipsets.

    There is a sweet spot one or two cards down from the high end that I like. One of the main selling points of ATI has always been much better price vs performance compared to Nvidia. I would think that price/performance sweet spot would be very popular in ATI circles.

    I'm just hoping Nvidia doesn't shoot themselves in the foot and price their new cards so high that people just snub them and buy ATI. Guess we will see in a few weeks.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited March 2010
    DX11 games are already here. I've been playing S.T.A.L.K.E.R.: Call of Pripyat a lot, and I plan on getting the new AvP game soon. I've also had BattleForge for a while now, but I haven't played it much.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited March 2010
    More FPS for Nvidia, but ATI looks much better.

    Oh, btw, my PC has a GTX 285, so don't call me an ATI fanboy ;)
  • Archived Post Member Posts: 2,264,498 Arc User
    edited March 2010
    I've already pointed out in another thread how these results are bullcrap. Only a complete noob points to them to prove anything. The tests were run during open beta, not post-launch, and not with recent ATI drivers. Lame attempt.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited March 2010
    Don't call the benches TRIBBLE; they're not. Now, it may not be apples to apples, but the fact that an older-series video card can get better FPS means nothing; overall quality is what you're looking for. And just because it's a DX11 card doesn't mean it will run DX10/9 games better: the API is the same no matter what card you're using. The bottom line is at the manufacturer, where the drivers come from. This game is optimized for nVidia hardware; there are some that are optimized for ATI, and each will run differently on each hardware set. Until there is a unified driver set for all graphics cards, you will always have this issue. Each company will have a game vendor that wants to stick its particular logo on a game; it all comes down to money.

    Plain and simple, ATI has always had a better price-to-performance ratio, but they have had their downs too.
    nVidia was the leader in the graphics market for a time, but now it's going the other way, because they don't have DX11 hardware at this time. That's what's being said and what's accepted, but the track record of both vendors is a cycle: one holds the top, and then they switch. It's been an endless cycle since the Voodoo days with 3dfx.

    And I say the P/P ratio for ATI because even their "budget" cards will run most games decently, where nVidia has always played the "you want to play <insert game title>? then you need <insert card model>" game for the best performance.

    All my opinion by the way, so take it as it is. :)
  • Archived Post Member Posts: 2,264,498 Arc User
    edited March 2010
    Neocorteqz wrote: »
    Don't call the benches TRIBBLE; they're not. Now, it may not be apples to apples, but the fact that an older-series video card can get better FPS means nothing; overall quality is what you're looking for. And just because it's a DX11 card doesn't mean it will run DX10/9 games better: the API is the same no matter what card you're using. The bottom line is at the manufacturer, where the drivers come from. This game is optimized for nVidia hardware; there are some that are optimized for ATI, and each will run differently on each hardware set. Until there is a unified driver set for all graphics cards, you will always have this issue. Each company will have a game vendor that wants to stick its particular logo on a game; it all comes down to money.

    Plain and simple, ATI has always had a better price-to-performance ratio, but they have had their downs too.
    nVidia was the leader in the graphics market for a time, but now it's going the other way, because they don't have DX11 hardware at this time. That's what's being said and what's accepted, but the track record of both vendors is a cycle: one holds the top, and then they switch. It's been an endless cycle since the Voodoo days with 3dfx.

    And I say the P/P ratio for ATI because even their "budget" cards will run most games decently, where nVidia has always played the "you want to play <insert game title>? then you need <insert card model>" game for the best performance.

    All my opinion by the way, so take it as it is. :)

    Don't call outdated benchmarks TRIBBLE? But they are! They're already outdated and inaccurate.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited March 2010
    20 fps is playable. Years ago, with my old computer, I sometimes played at like 15 and still pwned people.
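    For a sense of what those frame rates mean as per-frame time budgets, a quick sketch (plain arithmetic, nothing game-specific):

```python
def frame_time_ms(fps: float) -> float:
    """Average time spent on each frame, in milliseconds, at a given frame rate."""
    return 1000.0 / fps

# 15 fps is about 66.7 ms per frame, 20 fps is exactly 50 ms, 60 fps about 16.7 ms
for fps in (15, 20, 60):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms/frame")
```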
  • Archived Post Member Posts: 2,264,498 Arc User
    edited March 2010
    Well, now that Nvidia has outright ADMITTED that the recent driver can cause the video card fan to stop spinning at critical times, I think this is an industry first: the first driver update in the video card industry known to cause widespread FAILURE of the hardware involved.

    There have been many times in the past when both Nvidia and ATI were accused of releasing card-killing drivers, but this is the first time it's so widespread as to be confirmed by Nvidia itself.

    "NVIDIA 196.75 Driver Alert
    We are aware that some customers have reported fan speed issues with the latest 196.75 WHQL drivers on NVIDIA.com. Until we can verify and root cause this issue, we recommend that customers do not download this driver. Instead, please stay with, or return to 196.21 WHQL drivers. Release 196.75 drivers have been temporarily removed from our website and we also are asking our partners and others to remove temporarily this 196.75 WHQL driver as well."

    http://www.pcper.com/
  • Archived Post Member Posts: 2,264,498 Arc User
    edited April 2010
    Don't call outdated benchmarks TRIBBLE? But they are! They're already outdated and inaccurate.
    They are still relevant to a point. They can still let you know what kind of performance you can get with certain engines, and some benchmarks still get updated for current hardware and code.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited April 2010
    UnknownXV wrote: »
    Actually, as long as you have a dual core, the CPU makes no difference at all.

    Nope, I went from a

    AMD X2 3800+
    2GB DDR
    8800GT
    21 inch CRT

    to the system in my sig, and this runs much better.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited April 2010
    Fencer8 wrote: »
    we are still waiting for the 300 series from NVIDIA that was never released; sometime in April we are supposed to see this latest card.

    Sorry, but the 300 series from nvidia has been out since late January / early February. I remember reading that some laptops and a few OEM builds were using them as a slightly more efficient 200 series, though even that is a stretch, as they are in no way designed to run games; they are mainly used for media in HTPCs.

    http://www.tomshardware.com/news/nvidia-geforce-gt-300-oem,9727.html

    Above link as proof that they are out there, though why you would want one is anyone's guess.

    P.S. Sorry for bringing this dead horse back to life again. -runs and hides from the hate and flames-
  • Archived Post Member Posts: 2,264,498 Arc User
    edited April 2010
    Harpy89 wrote:
    Sorry, but the 300 series from nvidia has been out since late January / early February. I remember reading that some laptops and a few OEM builds were using them as a slightly more efficient 200 series, though even that is a stretch, as they are in no way designed to run games; they are mainly used for media in HTPCs.

    http://www.tomshardware.com/news/nvidia-geforce-gt-300-oem,9727.html

    Above link as proof that they are out there, though why you would want one is anyone's guess.

    P.S. Sorry for bringing this dead horse back to life again. -runs and hides from the hate and flames-

    The "300" series is a massive joke! It's a rebranded 200 series, which is itself a rebranded G90. 9800GTX = GT240 = GT330; the die shrink is the biggest change from first to last, but performance is the same across all three.

    On the plus side, though, modern laptops offering GT330M graphics pack a hella punch and are pretty common now. I'd buy one in a heartbeat, even knowing the GPU is nothing but a rebranded 9800GTX on a shrunk process that runs a bit cooler and draws less power.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited April 2010
    The "300" series is a massive joke! It's a rebranded 200 series, which is itself a rebranded G90. 9800GTX = GT240 = GT330; the die shrink is the biggest change from first to last, but performance is the same across all three.

    On the plus side, though, modern laptops offering GT330M graphics pack a hella punch and are pretty common now. I'd buy one in a heartbeat, even knowing the GPU is nothing but a rebranded 9800GTX on a shrunk process that runs a bit cooler and draws less power.

    Oh, I never said they were a gaming GPU, or even a serious GPU, and yes, I do agree they are a joke. The only things they are good for are basic graphic design / college projects, anything simple really; they are also quite good for watching Blu-rays, or any other video media for that matter.

    The power/noise/heat benefits are the only real upsides I can see from them. Though if anyone has a 300-series GPU, it might be interesting to see how STO runs on one; I suspect a 9800GTX would run STO much better, imo.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited April 2010
    Minor self-correction. G92 was the original 9800GTX, not G90. And I agree, the lower power and cooler running are big benefits on their own.