
How to tell if your computer can handle this game

Archived Post Posts: 1,156,071 Arc User
edited July 2014 in PC & Technical Issues
The official system requirements may not be terribly useful to a lot of people, as it's not immediately clear how various video cards or processors compare from the model numbers. Most of the system requirements are pretty straightforward, but I'll give more detailed advice here on what you need in a video card, processor, and memory. If you don't know what hardware you have, you can find out by running DxDiag from the Start menu. The "System" tab will show your processor and memory, while the "Display" tab (or "Display 1") will show your video card.

Video card

Check the driver date. If it's not in the last few months, then go update your video card drivers, no matter what card you have. A lot of cards that can handle the game won't work properly if you're using really old drivers. Some quick driver links, since some people seem not to know where to look:

Radeon drivers (for desktop Windows systems): http://game.amd.com/us-en/drivers_catalyst.aspx
Radeon drivers (for everything; less direct link than the line above): http://support.amd.com/us/gpudownload/Pages/index.aspx
GeForce drivers: http://www.nvidia.com/Download/index5.aspx?lang=en-us
Intel drivers: http://downloadcenter.intel.com/Default.aspx

Radeon

If you have a Radeon card with either an X or an HD before a model number, then add the hundreds and thousands digits of the model number (e.g., for a Radeon HD 5770, we get 5+7 = 12). If the number only has three digits, then the thousands digit is zero. If the sum is at least ten, then your card can run this game just fine.

If the sum is between 7 and 9, or if the sum is 6 or less but there is an HD before the number (as opposed to an X), then your card can more or less run the game, but you'll have to turn graphical settings way, way down to make the game playable.

If the sum is 6 or less and the model number has an X and not an HD, then you may or may not be able to technically get the game to run, but I wouldn't recommend trying it unless you upgrade your video card. Also, if you have a Radeon with neither an X nor an HD (e.g., Radeon 9250 SE), then it doesn't support DirectX 9.0c at all, so it won't run the game.
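The Radeon rule above can be sketched in a few lines of Python; the digit-sum heuristic and cutoffs are exactly the ones just described, while the function name and verdict strings are just mine:

```python
def radeon_verdict(prefix, model):
    """Digit-sum rule: add the thousands and hundreds digits of the
    model number (the thousands digit is 0 for three-digit models)."""
    if prefix not in ("X", "HD"):
        # e.g. a Radeon 9250 SE: no DirectX 9.0c support at all
        return "won't run"
    s = model // 1000 + (model // 100) % 10
    if s >= 10:
        return "runs just fine"
    if s >= 7 or prefix == "HD":
        return "playable with settings way down"
    return "not recommended"

print(radeon_verdict("HD", 5770))  # 5 + 7 = 12 -> "runs just fine"
print(radeon_verdict("X", 600))    # 0 + 6 = 6, X -> "not recommended"
```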

GeForce

If your card is a GeForce with a G and possibly other letters (GT, GTS, GTX, etc.) and then a three digit number, add the first two digits of that number. If there's an M after the model number, then subtract two from this sum. If the number you're left with is at least 4, then your card can run the game just fine. If the sum is less than that, then you can probably get the game to run, but you'll have to turn graphical settings way, way down to make it playable.

If it's a GeForce, and then a four digit number, and then some letters, then compute the second digit plus (two times the first digit). If that sum is at least 24, then you're set, and your card can run the game just fine. If the sum is 22-23, then your card can handle the game, but you'll have to turn graphical settings way, way down. If the sum is 20-21, then you may or may not be able to get decent performance out of the card by turning settings way down; Nvidia's naming scheme is a complete mess, which doesn't help here. If the sum is less than 20, then I wouldn't try running the game without replacing your video card.

If it's a GeForce and then a three digit number, with no letters between GeForce and the number (e.g., GeForce 310), then it's a recent card, but very low end (and likely integrated). The game will probably technically run, but you'll have to turn video settings way, way down to make it playable. The exception is if it's a GeForce 256, which is ancient, and can't handle the game.

If it's a GeForce and then a single digit number (e.g., GeForce 4 Ti 4600) or a GeForce FX or PCX, then the card doesn't support DirectX 9.0c. Even if it did, the card would be too old and slow to be of much use.
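All of the GeForce cases above can be rolled into one sketch; the sums and cutoffs are the ones just described, while the function name, flags, and verdict strings are mine (the FX/PCX cards aren't modeled separately, but they can't run the game anyway):

```python
def geforce_verdict(model, letters=False, mobile=False):
    """`letters` means G/GT/GTS/GTX etc. appear before the number;
    `mobile` means an M after it."""
    if model >= 1000:
        # four-digit names: second digit plus two times the first digit
        s = (model // 100) % 10 + 2 * (model // 1000)
        if s >= 24:
            return "runs just fine"
        if s >= 22:
            return "playable with settings way down"
        if s >= 20:
            return "maybe, with settings way down"
        return "replace the card first"
    if model == 256:
        return "won't run"          # GeForce 256: ancient
    if model < 100:
        return "won't run"          # GeForce 4 era: no DirectX 9.0c
    if not letters:
        return "playable with settings way down"  # e.g. GeForce 310
    s = model // 100 + (model // 10) % 10  # add the first two digits
    if mobile:
        s -= 2                      # mobile parts lose two points
    return "runs just fine" if s >= 4 else "playable with settings way down"

print(geforce_verdict(250, letters=True))   # GTS 250: 2 + 5 = 7 -> fine
print(geforce_verdict(9600, letters=True))  # 9600 GT: 6 + 2*9 = 24 -> fine
```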

Others

If you have Intel graphics, then check what processor you have. If it's a Core i3, i5, or i7, then your integrated graphics can handle the game if you turn video settings all the way down. If you have Intel graphics and any other processor, I wouldn't try running the game. Get a video card if you want to play this game.

If you have a Quadro or FirePro card, then it's much messier. Find out what the code name on the GPU chip is (e.g., G92 or RV770), and then find a GeForce or Radeon card that uses a GPU with the same code name. Check the above section to see if the GeForce or Radeon card can handle the game, and the answer for your card is the same unless you run into driver issues. And you might run into driver issues, as your drivers aren't meant for gaming.

If you have graphics that aren't already listed, then assume they can't handle the game. This includes graphics by Matrox, S3, and VIA, among others, as well as some very old cards such as the ATI Rage Pro or Nvidia Riva TNT.

Also, please note that the numbers computed above in the different sections are not comparable at all. And a higher number isn't necessarily better, even within a given section; I just tried to rig things to have reasonably clean cutoffs.

Processor

Find what processor you have, and find the clock speed on it in GHz. On the processor line, it will say something like "~2.8GHz" at the end of the line; 2.8 is the number you're after in that case.

Take that number and multiply it by the number of CPUs DxDiag says you have, if it says 1 or 2. If it says 3 or more, then multiply by 2.5 instead of the reported CPU count.

Next, pick out your architecture from the list below and multiply it by the number given:

Athlon 64: 0.8
Athlon II: 0.9
Core 2: 1
Core i3: 1.2
Core i5: 1.3
Core i7: 1.3
Pentium D: 0.5
Pentium E: 0.9
Pentium G: 1.1
Pentium T: 0.9
Phenom: 0.9
Phenom II: 1
Turion II: 0.9

The Pentium names listed there aren't entirely standard: by a Pentium G, I mean something like a Pentium G6950; a Pentium E5500 would be a Pentium E; and so forth.

If you have anything else not listed (e.g. Pentium 4, Celeron, Sempron, Atom, Turion, Athlon XP), then basically assume that it can't handle the game. The exception to this is Opteron and Xeon (server) processors, which have spanned so many architectures that I'm not even going to try. Those should be very rare on a desktop, though. The most common reason to have them in a modern desktop that I can think of is the Xeon in a Mac Pro (not MacBook Pro!), which will perform about like a Core i7, and should have plenty of power for this game.

For example, I have a Core i7-860, which is clocked at 2.8 GHz. I'd take 2.8 (the clock speed) times 2.5 (since DxDiag says 8 CPUs) times 1.3 (for the architecture) to get a number of 9.1.

Anyway, a bigger number is better. If the number that comes out is at least six, then you're set, and your processor can handle the game. If the number is at least 4 but less than 6, then the game will still be playable, albeit with less than ideal frame rates. If the number is at least 3 but less than 4, then the game really won't perform well, but depending on how poor of frame rates you're willing to accept, you might consider it playable. If the number is at least 2 and less than 3, then it's going to be pretty choppy, and while some people would call it playable, I certainly wouldn't. If the number is less than 2, then don't even try. You won't be able to play the game without a better processor.
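The whole processor calculation is just a product of three numbers; here's a sketch with the multipliers copied straight from the list above (the function name and dictionary are mine):

```python
ARCH = {
    "Athlon 64": 0.8, "Athlon II": 0.9, "Core 2": 1.0, "Core i3": 1.2,
    "Core i5": 1.3, "Core i7": 1.3, "Pentium D": 0.5, "Pentium E": 0.9,
    "Pentium G": 1.1, "Pentium T": 0.9, "Phenom": 0.9, "Phenom II": 1.0,
    "Turion II": 0.9,
}

def cpu_score(clock_ghz, cpus, arch):
    """Clock speed times core factor times architecture multiplier.
    The core factor is what DxDiag reports, capped at 2.5 for 3 or more."""
    core_factor = cpus if cpus <= 2 else 2.5
    return clock_ghz * core_factor * ARCH[arch]

# The worked example above: a Core i7-860 at 2.8 GHz, DxDiag reporting 8 CPUs
print(round(cpu_score(2.8, 8, "Core i7"), 1))  # -> 9.1 (at least 6: all set)
```

A score of 6 or more means you're set; 4 to 6 is playable; below 2, don't even try, per the thresholds above.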

Memory

This one is pretty easy. If you're running Windows XP, then you need at least 1 GB of system memory, though having significantly more than this (e.g., 2 GB) would be ideal. If you're running Windows 7 or Vista, then you need at least 2 GB of system memory. It doesn't matter whether it's a 32-bit OS or a 64-bit OS. If you're running Mac OS or Linux using Cedega or Wine or some other such program, then 2 GB is enough that system memory won't be an obstacle, though I can't promise that the software that you're using to handle the DirectX API will work. I'm not sure if 1 GB would be enough for other operating systems; it surely isn't for Windows 7 or Vista.


If your computer can't handle the game, it may be possible to upgrade it. Some computers will go from "game is basically unplayable" to "game performs pretty well" with a $70 upgrade, so it's at least worth looking into. Such an upgrade won't benefit only Champions Online, but pretty much any other modern game you might want to play, so don't think of it as increasing the price tag on this particular game. Anyway, here's a thread that discusses whether your computer can be upgraded and what to upgrade it with:

http://forums.champions-online.com/showthread.php?t=109623

If upgrading your computer is impractical, it may be better to just buy a new computer. Here's a thread that discusses what to get in a new computer:

http://forums.champions-online.com/showthread.php?t=102211

Comments

  • Archived Post Posts: 1,156,071 Arc User
    edited June 2010
    Quaternion wrote:
    Radeon
    [...]
    If the sum is 6 or less and the model number has an X and not an HD, then you may or may not be able to technically get the game to run, but I wouldn't recommend trying it unless you upgrade your video card.
    And on that note, it's "true story time". :D

    During Open Beta, I didn't yet have the machine I'm currently enjoying. I had a Pentium 4 (2.8GHz) CPU, 2.5GB of RAM (2x1GB and 2x256MB, a very poorly-balanced setup) and a Radeon X600. CO ran, and I even got tolerably-playable framerates (in the neighborhood of 15 to 20 - not great, but, it worked ... mostly).

    But it was FUGLY. Almost all of the detail was missing from the game - I had no reflectivity, so cloth/leather/metal didn't matter for my characters' costumes ... it all looked the same, on me and everyone else. Similarly, the "detail" level didn't show up. The same sort of things happened to the environment, in-game. And a developer told me in-game, directly and in as many words, that he was surprised that CO could run on that machine.

    So while it might technically run ... don't count on it.

    And that bit leads nicely into ...
    Processor
    Before I (maybe) say anything about what Quaternion wrote, I want to relate the NEXT experience I had with CO, when I took the above P4 computer and put a much nicer video card into it ... this NVidia GeForce GTS 250, in fact.

    And ran afoul of one of CO's little hidden "traps": a lot of the game is solidly processor-bound, and my P4 just couldn't handle it. Y'see, the peculiar mix of P4 and X600 had an unexpected advantage going for it: since my graphics card couldn't even TRY to do a lot of the shiny, special, extra-beautiful stuff?

    It pretended that stuff wasn't even there, to begin with - and thus, didn't burden my CPU with any of it.

    But once I put in a card that COULD attempt those things? A lot of extra work suddenly got dumped on my CPU. Work it couldn't do, not fast enough for the game to be playable. The money I spent on that GTS 250 was wasted, every dollar of it - and it was a $135 purchase, so that was no small sum!

    ...

    The moral of this story is: make sure ALL your components are up to the task of running CO. "A chain is only as strong as its weakest link", and all that.




    And, by the by, don't let yourself be discouraged if you have to turn things down in CO, to get it to run smoothly. Even with my system (specs below), I don't, and probably couldn't, play with ALL the bells and whistles active, not and keep a good framerate going. :cool:
  • Archived Post Posts: 1,156,071 Arc User
    edited June 2010
    The Radeon X600 doesn't support DirectX 9.0c, but it does support DirectX 9. It's interesting to know that you could get the game to run at all. That's probably why it disabled some effects.

    That GeForce GTS 250 must not have lasted you very long if you replaced it by a GeForce GTX 275 later in the same year that the GTS 250 was released (err, rebranded from a GeForce 9800 GTX+).

    Also, "can run the game just fine" above in the video card portion doesn't mean "can sensibly max settings". It means you'll be able to at least turn some video settings up quite a ways.


    Of course, hardware that can run the game smoothly everywhere on max settings doesn't yet exist. I think I might have found the single most demanding area in the game. Do the level 26 Help a Citizen quest "Bugs in the Contraband". In the last room of the mission instance, there are some small racks of missiles. Stand toward one end, spaced evenly between the two rows of racks. Use an area of effect skill that will hit both rows of missiles. If you do it right, you can set off a chain reaction that will cause 42 missiles to explode simultaneously, each with a pretty big explosion. That will reduce your frame rate to a slide show for a couple of seconds, no matter what hardware you have.

    From running some other programs to monitor the CPU and GPU, it looks like what happens is that it maxes the load on one processor core. Presumably the explosions are single-threaded, so it can only go as fast as that one processor core can go. 42 of them at once really overloads it, which brings things to a standstill. Given that CPU frequency has basically topped out, there may well never be a processor made that can keep a good frame rate if you blow up all the missiles at once like that. (Processors in the distant enough future likely won't be compatible with the game code.)
  • Archived Post Posts: 1,156,071 Arc User
    edited June 2010
    Quaternion wrote:
    That GeForce GTS 250 must not have lasted you very long if you replaced it by a GeForce GTX 275 later in the same year that the GTS 250 was released (err, rebranded from a GeForce 9800 GTX+).
    Oh, that GTS250 is still in that old P4. While it didn't help with Champions Online, it did improve graphics in other applications and games. My g/f uses it on the weekends, when she's not _here_. (Our home life, and our "family", is ... non-traditional, shall we say, and leave it at that. :) )

    Meanwhile, this machine is an entirely new computer, stem to stern, inside and out. The only parts it has in common with the P4, is the old LCD flat-panel monitor ... and me: the fat guy between the chair and the keyboard. :D
  • Archived Post Posts: 1,156,071 Arc User
    edited June 2010
    I like this information you all have posted. I am going to sticky this thread.
  • Archived Post Posts: 1,156,071 Arc User
    edited June 2010
    Go to yougamers.com and search for Champions Online. Run a system check from that site and it will tell you if your PC can run the game, and it will recommend what video card and CPU are needed to play it at its best.
  • Archived Post Posts: 1,156,071 Arc User
    edited June 2010
    While that site can be a useful tool, they don't know the particular details of how various games scale, and a single benchmark really doesn't fit everything. Their system requirements for Civilization 4 are completely laughable, for example. They list a Pentium D as the "optimized" processor, but the game is single-threaded, so having a second core isn't meaningfully better than only having one core. Indeed, it's likely worse, as more cores tends to mean that they have to be clocked lower to keep power consumption in check. Meanwhile, on a Core 2 Duo that is better than their "optimized" processor, the game was processor-bound and ran slow enough for me that I gave up on it until I replaced my computer. Even on my Core i7, it doesn't run as well as I'd like.

    As for Champions Online in particular, the site says that a 2.5 GHz Pentium 4 is the minimum processor requirement. The game would be basically unplayable on that processor. They probably took it from the official system requirements of "2.5GHz Single Core or 1.8GHz Dual Core". The problem with the official system requirements is that it's hard to say how various architectures compare. A modern 2.5 GHz Sempron (a cut down Athlon II) probably would make the game playable, even if it didn't run very well. Or even if one wants to talk about a dual-core processor, there's an enormous difference between a dual core Pentium D and a dual core Core i3; the latter can deliver over triple the performance of the former at the same clock speed in some cases.

    What I wanted to do in this thread was to compensate for such architecture differences, to give people a better explanation of what hardware will run how well. For example, the official recommended system requirement for a processor is "Intel E8400 Core2Duo or Better". And it's accurate that a processor that is at least as good as that 3 GHz Core 2 Duo will run the game just fine. The problem is that most people wouldn't know if, say, an Athlon II X4 630, a Pentium G6950, or a Core 2 Quad Q6600 are "better" than a Core 2 Duo E8400. The above thread will give a clearer comparison--and one optimized for this particular game, rather than a one-size-fits-all benchmark that doesn't know how well a particular game scales to more processor cores.
  • Archived Post Posts: 1,156,071 Arc User
    edited June 2010
    This thread is pure genius. Thanks for simplifying the unsimplifiable.
  • Archived Post Posts: 1,156,071 Arc User
    edited June 2010
    Quaternion wrote:
    As for Champions Online in particular, the site says that a 2.5 GHz Pentium 4 is the minimum processor requirement. The game would be basically unplayable on that processor.
    Not "basically", at all. I first tried to run CO on a P4 2.8GHz, and once I had a modern graphics card in that machine ... it was literally and completely unplayable. I've seen literal slideshows with higher framerates - and that was while the LOADING SCREENS were being displayed.

    And no, I'm not exaggerating in the slightest when I say that. :(
  • Archived Post Posts: 1,156,071 Arc User
    edited June 2010
    I had a Pentium D 2.8 and an 8800 GTX, and the best I could hope for, with a lot of changes to the graphics options, was 35 fps, and that was basically due to the CPU.
  • Archived Post Posts: 1,156,071 Arc User
    edited June 2010
    Was hoping to poke my head here and get some feedback from you folks.
    While I like to build my own PCs, I am far from being a PC egghead. All it means is I can follow directions fairly well. :p

    I think I already know the answer to my question, but would like verification and feedback if possible.

    My current PC is a bit over 2 years old now, it is the following:
    Core 2 Duo E7200 @2.53 GHz (2CPUs)
    2046 MB of RAM
    NVIDIA GeForce 9600 GT (512MB RAM)
    Win XP Home

    My system is starting to show its age.
    I only play two MMOs, Champions and City of Heroes.
    After the recent video upgrade over there, I thought it about time I try and update my PC a bit.
    In CO I get about 20 fps average, slightly higher for indoor maps.
    I am fairly scroogish when it comes to spending money, so I wanted to upgrade only one thing, and so in order to see the biggest improvement in game performance I'm thinking that my graphics card is my weakest link.

    I was considering getting a GeForce GTX 275.
    However, this might require me to also upgrade my PSU.
    I learned a while back that when building my PCs the large watt number on the PSU box isn't always enough info. I had to check the rail amps to make sure the GPUs are being supplied adequately.

    This is my current PSU:
    http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=3255393&CatId=1483

    I am having a hard time finding specific info on whether this can handle a single GTX 275.

    I recall reading a while back that the GTX 275 requires (I think) 40A.
    My PSU only supplies 32A

    Since I'm not 100% sure on this info, if it turns out I need to also upgrade my PSU, can folks recommend a decent one for a scrooge like me.

    Lastly, when I see different versions of the GTX 275, I see they vary in RAM amounts. Is there really much difference in performance between these similar cards? (Assuming the only thing different is the onboard RAM amounts).

    Thanks in advance for any feedback. Much appreciated.
  • Archived Post Posts: 1,156,071 Arc User
    edited June 2010
    Your system as it is should be able to handle the game just fine if you turn video settings down a ways. If the issue is that you don't want to turn video settings down a ways, and want to upgrade so that you can turn video settings up further, there's nothing wrong with that.

    There's really no sense in getting a GeForce GTX 275 new today, unless you can find a really great deal on one or are planning to get one used. In your price range, the reasonable choices for a new video card are a Radeon HD 5770 ($150) or a Radeon HD 5850 ($300). The problem with the GeForce GTX 275 is that it typically has performance closer to the 5770, a price tag closer to the 5850, power consumption much greater than either, and a rather dated feature set, most notably lacking DirectX 11 support. If you can find one for $150, then sure, have at it, but don't pay $250 for a GTX 275.

    Regarding video memory, nearly all GeForce GTX 275s have 0.875 GB (896 MB) of video memory. There might occasionally be one that has 1.75 GB (1792 MB) of video memory. There could occasionally be other amounts if a board manufacturer leaves a memory channel empty, which would be irresponsible and something Nvidia should crack down on. I haven't seen that on Nvidia cards (save the GeForce GT 330, which is basically a failed G92b die where they just disable whatever doesn't work and sell the rest), though occasionally I'll see it on an ATI card.

    Video memory, like system memory, is basically a case of, if you have enough, then adding more won't help, but if you don't have enough, you take a painful performance hit. If you're using a 2560x1600 monitor, then 0.875 GB often won't be enough video memory, but for 1920x1200 or smaller, the extra video memory won't help you.

    It looks like this is your power supply:

    http://www.ultraproducts.com/applications/SearchTools/item-details.asp?EdpNo=3255393&CatId=1483

    It's a single rail design with 32 A on the +12 V rail, so it can deliver 12*32 = 384 W on the +12 V rail. Basically, it's a 400 W power supply masquerading as a 600 W power supply. That's the sort of thing that reputable power supply manufacturers just don't do, so that's probably not the only thing wrong with it.

    It wouldn't be able to power a GeForce GTX 275. That card requires two 6-pin PCI-E power connectors, and your power supply only has one. If you were to get a Radeon HD 5770 (which has performance similar to a GeForce GTX 260), you'd be able to keep your current power supply. Though there's a decent case for replacing the power supply just on general principle, as system stability and hardware longevity are good things.

    As for a new power supply to get if you want a new one, this one is a great deal, at least if you're willing to mess with mail-in rebates:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16817371020

    Actually, it's a pretty good value even if you don't do rebates. It has a markedly lower price than it used to have (and not just on New Egg), so I somewhat suspect that Antec is trying to clear inventory in advance of a refreshed version. For what it's worth, it's rated at 45 A on the +12 V rail, which comes to 540 W on the +12 V rail alone. Calling it a 550 W power supply is no exaggeration.

    As I said above, I'd recommend getting a Radeon HD 5770 or 5850, as those are the ones that are a good value for the money in the price range you're looking at. There aren't any cards in between those in performance that are a good value for the money unless you find a special deal or something. The Radeon HD 5770 has a TDP of only 108 W; for comparison, your current card has a TDP of 96 W, assuming it's at the stock clock speed. The Radeon HD 5850 has a TDP of 151 W, while the GeForce GTX 275 has a TDP of 219 W. A Radeon HD 5770 will give about double the performance of your current video card, while a Radeon HD 5850 will probably more than triple it. Don't count on tripling your frame rates, though, as you'd be processor-bound first.

    If you're really set on a GeForce GTX 275, you should check your case to see if it has adequate airflow to accommodate a video card that dissipates 219 W. (Well, the TDP tends to overestimate, but you're probably looking at significantly over 150 W in some games.) One air vent to draw air in and one case fan to blow air out won't cut it, unless perhaps the card is designed to release most of its heat directly out the back of the case. If you've got several case fans, including one on the side blowing air right at the video card, you'd be fine, though.
  • Archived Post Posts: 1,156,071 Arc User
    edited June 2010
    Quaternion wrote:
    Your system as it is should be able to handle the game just fine if you turn video settings down a ways. If the issue is that you don't want to turn video settings down a ways, and want to upgrade so that you can turn video settings up further, there's nothing wrong with that.
    Yeah, sorry about that. I should have been more clear about what I was trying to say. Bad habit of mine. :o
    I currently have most settings that I like turned up to max, or nearly max. Some settings that I don't particularly like, such as bloom and depth of field, I generally turn completely off; even if I had the best GPU in the world, I would turn them off, as I dislike their visual look.

    My goal now is to be able to max the settings that I do like and have a nice frame rate all around. Especially when on a team and having multiple graphical effects going on. When I solo, my GPU handles things fairly well, especially on instanced maps where I get ~30 fps.

    FYI, I run dual Acer 22" wide screen monitors at 1680x1050 powered by just my single GeForce 9600 GT. I actually think it does quite well handling that load for such a cheapy card (at least it was a cheapy card back when I got it. :p ) Granted the second monitor isn't doing much other than displaying static images and web pages etc while the main screen is running the game.
    Now my card is probably at or very near the bottom of the list.
    Quaternion wrote:
    There's really no sense in getting a GeForce GTX 275 new today, unless you can find a really great deal on one or are planning to get one used. In your price range, the reasonable choices for a new video card are a Radeon HD 5770 ($150) or a Radeon HD 5850 ($300). The problem with the GeForce GTX 275 is that it typically has performance closer to the 5770, a price tag closer to the 5850, power consumption much greater than either, and a rather dated feature set, most notably lacking DirectX 11 support. If you can find one for $150, then sure, have at it, but don't pay $250 for a GTX 275.
    I should also have mentioned that I tend to avoid ATI cards. I prefer NVIDIA cards.
    As for specific NVIDIA models, I'm open to debate there, because their naming standards tend to go way over my head and make no sense to me. :confused:
    I was planning on waiting til around November or so to get the card; my hope was that by then the card would be under $200 (hopefully closer to $150). I try never to spend more than $200 on a video card. That is my general cutoff price point.

    But what you are saying about the GTX 275 makes me reconsider whether it's really as good a card to get as I originally thought.
    Quaternion wrote:
    As for a new power supply to get if you want a new one, this one is a great deal, at least if you're willing to mess with mail-in rebates:

    http://www.newegg.com/Product/Produc...82E16817371020
    This is where I have a hard time with things.
    I am looking at your link to that power supply and I am seeing stats under power output, but do not see anything that "clearly" tells me that it provides 45A on the 12V rails.

    This is what I am looking at FYI:
    +3.3V@25A, +5V@25A, +12V1@20A, +12V2@20A, +12V3@20A, +12V4@20A, -12V@0.8A, +5VSB@3.0A

    I am not doubting you of course, I'm just not able to understand the info and extrapolate what I need from it. Any insight to this would be much appreciated. :D

    Final thoughts:
    So I assume everything else except for my GPU is fine then since you didn't comment on those negatively.
    I was unsure how to calculate my Core 2 Duo using your opening post calculation methods, but I think it was within acceptable values assuming I did it correctly.

    Again, Thank you very much for all of the info, greatly appreciated.
  • Archived Post Posts: 1,156,071 Arc User
    edited June 2010
    If you want to know whether your processor is good enough, try turning video settings way, way down and see what frame rates you get. That's about what your processor could deliver if you had a really great video card. Some video settings do affect the processor load, but it's not nearly so big of an effect as how hard the settings make a video card work.

    Don't count on prices on a GeForce GTX 275 falling. AMD can deliver the same performance as Nvidia in cards that cost vastly less to build, so they kept cutting prices and daring Nvidia to match. Finally Nvidia gave up and discontinued their GT200b cards (including the GTX 275) rather than taking a loss on every card they sold, and hiked prices considerably to keep them in stock for quite a while, both as a way for fanboys to give them money, and to avoid the embarrassment of having obviously retreated from the high end market.

    If you want an Nvidia card today, you've basically got three options:

    1) Get a GeForce GTS 250, which is the third most recent name of their old 8800 series cards, and the modern name for the top bin of the G92b chip. That may give you a little under double the video performance of what you have now, and for around $100. That's the best card that Nvidia has that is a good value for the money.

    2) Get an old GT200b card at an absurd price. It will perform well, but you'll basically pay an extra $50 for the privilege of having an Nvidia logo on it, in addition to using a lot more power than more modern cards and lacking DirectX 11 compatibility.

    3) Get a new GF100 "Fermi" card, that is, a GeForce GTX 465, 470, or 480. Or a "Thermi" as they're sometimes called, as they run dangerously hot at even moderate loads, in addition to being some of the noisiest cards since the old GeForce FX 5800 "Dustbuster". It's a complete train wreck of a chip, as even the top bin halo part is a salvage part (part of the chip disabled). There are certain gains in performance per watt and performance per mm^2 of die size that Nvidia should theoretically have gotten by moving from a 55 nm process to a 40 nm process. Nvidia somehow failed to get those gains. AMD did get the gains they should have gotten, which is why ATI cards are the only reasonable ones to buy at $150 and up. For example, check here:

    http://www.techpowerup.com/reviews/Zotac/GeForce_GTX_465/30.html

    AMD's entire modern lineup of cards (including the pathetic crippled chip 5830) absolutely slaughters Nvidia's entire modern lineup in performance per watt. Even some of AMD's previous generation beats Nvidia's latest and greatest.

    Fortunately for you, however, you're not urgently looking to buy a new card right now. You can wait for Nvidia's upcoming GF104-based cards to release. The top bin of it might be branded as a GeForce GTS 460 or GeForce GTX 460, though Nvidia's naming scheme is such a mess that it's pretty unpredictable. That will basically be half of a GF100 chip. If Nvidia has figured out why GF100 is such a disaster and fixed it, then GF104 could well be a decent chip. I'd expect performance in the ballpark of a GeForce GTX 275 or 285, with a TDP around 150 W and a price tag of $200-$250 or so.

    As for when it will release, that's the big question. Nvidia's execution has been miserable of late. The GeForce G 210, GT 220, and GT 240, the mid-range and low-end derivative cards of the original GeForce GTX 200 series, were delayed by about a year. Two of the derivative chips were cancelled entirely. The public due date for the "Fermi" chips was October 22, 2009, the release date for DirectX 11 that Microsoft had announced long in advance. AMD had two chips out by then, with cards based on multiple bins of each, and then filled out the rest of their lineup in January and February. Nvidia didn't get their first GF100 cards to market until April (so basically a delay of six months), and there aren't yet even credible rumors about the release of the rest of their lineup. My best guess is that GF104 will be out this summer, but it wouldn't be completely shocking if it's not out by November, when you're looking to buy a card.


    For the power supply, click the picture of the green and black label. Below the table it says +12V1, +12V2, +12V3, +12V4 max load 45A. It would probably be better if it were labeled more clearly, but it is there.

    If you're not going to buy something until November, prices will probably have changed by then, so that power supply likely won't be such a great deal anymore. I wouldn't recommend paying $100 for it, for example. I just picked the power supply that would handle whatever video card you want that was the best deal I saw at the moment.
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited June 2010
    I want to emphasize (if it hasn't already been brought up) that this game is unusual in that it can actually put more of a load on the CPU than on the GPU. I had a computer with a Pentium D at 3 GHz and a GTX 260, and the game ran poorly. Once I upgraded to a Core 2 Duo, the game performed extremely well (18 FPS before to 40+ after). I have a laptop with a pretty good AMD dual core and a low-to-middle-of-the-road HD 4330, and the game actually runs pretty well. Short story is, you need a video card that's "powerful enough", but a more powerful CPU will do you a world of good. If you have a decent video card but only an "okay" CPU, upgrade the CPU. If you have a good CPU and a lower end GPU, you can probably already play the game pretty well (unless it's a really old video card...)
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited June 2010
    I've added links at the bottom to a thread on upgrading a computer, as well as one on picking hardware for a new computer.
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited June 2010
    Ashlocke wrote:
    I should also have mentioned that I tend to avoid ATI cards. I prefer NVIDIA cards.
    Then it's a fight between your urge to be frugal and your brand loyalty to Nvidia. :)

    Right now, and probably for at least the next 12-18 months, ATI has the hands-down lead in terms of "bang for your buck". Note, I say this despite having a GTX 275 of my own in this very PC; my Lady-love's computer has about the same graphics muscle as mine does ... and it's using an ATI HD 5770 card.
    But what you are saying about the GTX 275 makes me want to ponder on whether that might not be a good card to get like I originally thought it was.
    If you can find one in your price range - and bear in mind, that will require a PSU upgrade, because yes, looking at my 275 shows there are indeed TWO power connectors hooked to it - anyway, if you can swing a new PSU and a GTX 275 at a price point you're comfortable with, the card isn't bad at all.

    It's just that you could, with less Nvidia loyalty, very likely get the same benefit for less dollars.
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited June 2010
    It's not so much brand loyalty as not wanting to tempt the voodoo gods to smite me yet again.
    I'm not superstitious by any means, I just happen to have some freakishly bad luck.

    I have had 4 ATI cards in my lifetime, and all of them died mysterious deaths within 3 months of being installed.
    I have had 2-3 times as many Nvidia cards, and all have taken every beating under the sun unscathed.

    ATI has been the more expensive choice so far for me. :D

    Oddly enough I have had similar karma with hard drives. I buy only Western Digital these days. None of mine have ever failed. Yet my Maxtors, Quantums and Seagates have had similarly freakish deaths.

    Yes I realize it's all random chance, but such is my luck. :confused:
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited June 2010
    Unless those 4 ATI cards were pretty recent, that may well be ancient history, as it's a different company entirely now. ATI is now AMD's graphics division, rather than an independent company.

    For what it's worth, an ATI GPU chip and an Nvidia GPU chip are both manufactured at TSMC on exactly the same machinery, so manufacturing differences shouldn't cause one to be more likely to fail than another. You can get cards for either company assembled by exactly the same board manufacturers, too, so that shouldn't cause a difference, either, at least in a non-reference model. I think Gigabyte, Asus, MSI, and XFX are the ones that make cards for both sides.

    You can also buy a card, register it, and get a lifetime warranty from some companies. I'm pretty sure that XFX does that, and I think some other board partners offer it, too. On the Nvidia side, EVGA does, too, and I'm not sure about others; BFG used to, but they no longer make video cards. That way, if the card dies in three months, it's a nuisance to wait until it gets replaced, but not really expensive to replace it.

    For what it's worth, in the latest generation, an Nvidia card is probably more likely to die on you than an ATI card, because the Nvidia cards have far worse performance per watt, meaning that they put out far more heat for the same performance, and hence run far hotter.

    The standard rule of thumb in the industry is that if you make a computer part run 10 C hotter, it will only last 1/10 as long before it fails. That's how they come up with claims like a mean time to failure of one million hours: they ran it 40 C (or some other number; I don't know exactly what they do) hotter than they think you should, and found that on average it died in 100 hours.
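
    The arithmetic behind that million-hour claim is worth spelling out, since 40 C at 10x per 10 C compounds to a factor of 10,000. A quick sketch (the function name and the flat 10x-per-10-C factor are just illustrative assumptions from the rule of thumb above, not an industry formula):

```python
def projected_mttf(observed_hours, extra_temp_c, factor_per_10c=10):
    """Extrapolate mean time to failure from an accelerated-life test.

    Assumes each 10 C of extra heat cuts lifetime by `factor_per_10c`,
    so cooling back down to the rated temperature multiplies the
    observed lifetime by the same factor.
    """
    return observed_hours * factor_per_10c ** (extra_temp_c / 10)

# A part that died after 100 hours when run 40 C too hot projects to
# 100 * 10**4 = 1,000,000 hours at its rated temperature.
print(projected_mttf(100, 40))
```

The same compounding is why a card that runs 10-20 C hotter for the same performance is a meaningful reliability difference, not just a comfort issue.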

    But if you want to pay a fanboy tax, go ahead. Or wait for GF104 and hope it's good.
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited July 2010
    I've been running CO on a sub-minimum spec computer since Closed Beta:
    Pentium 4 2.0 GHz overclocked (lol) to 2.2 GHz
    1 GB RAM
    Radeon HD 3850 AGP

    I also use this computer to play STO.

    My graphics are indeed set low, but few are actually set at the lowest settings.

    I'm only adding this to the discussion so that potential CO players who have underpowered machines and are unable to upgrade for the moment will at least give the game a try.
    They may find that the game actually runs fine on their computer...
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited July 2010
    As listed in the original post, your video card can handle the game just fine. You'll have to turn video settings down quite a ways, but the video card isn't the real problem.

    The real issue is your processor, which is horrible. Different people have different thresholds of what they regard as a good enough frame rate, and if you're playing this game on a 2.2 GHz Pentium 4, you've probably got an awfully low threshold of what you're willing to accept.

    On your system, you should be able to turn up video settings that only tax the video card and don't affect the processor. Any settings that affect processor usage should be set to the absolute minimum.
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited July 2010
    Quaternion wrote:
    you've probably got an awfully low threshold of what you're willing to accept.

    Actually, since I don't have as much eye-candy to gawk at, it's even more important to me that the game runs smoothly.

    But I wasn't really soliciting a critique of my system anyway. As I said, I have been playing CO on it since before launch, so I know how well it runs. According to the minimum system specs as listed in the original post, it shouldn't run at all. Yet it runs surprisingly well. I hear people with much better computers complain about their framerate all the time. Even during major world events like Clarence, it has been my characters' builds that have been the limiting factor rather than system performance.

    The whole purpose of my original post was to urge people who may be interested in CO to try the game before they assume their computer is not up to the task. They may actually find that their current computer runs it well enough that they can play CO while they work on upgrading to something better.
    I would rather have someone playing and enjoying CO right now, rather than putting off CO until their computer is 'good enough' by which time they may have been distracted by some shinier, newer game. It happens.
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited July 2010
    So, IOW, you're saying to people "even if your computer is below what Cryptic says is required to run the program, throw money at it anyway, and just hope" ...?

    Talk about bad advice ... :(:rolleyes:
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited July 2010
    Trying the trial isn't exactly throwing money at it. It is quite a bit of a download, though.
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited July 2010
    Quaternion wrote:
    Trying the trial isn't exactly throwing money at it. It is quite a bit of a download, though.

    Precisely. There is no harm in trying the game before embarking on an upgrade path, especially if someone is already considering playing (which would require downloading it anyway.)
    By trying the game out on one's current system, one can more easily see where upgrades are necessary, and can better assess whether upgrading will require a whole new computer or merely a new video card or processor.
    If the game runs well enough, they may even decide to wait a bit longer so they can build a computer that's 'Mr. Right' rather than 'Mr. Right Now'.

    _Pax_ wrote:
    So, IOW, you're saying to people "even if your computer is below what Cryptic says is required to run the program, throw money at it anyway, and just hope" ...?

    Talk about bad advice ... :(:rolleyes:

    Yes, that is bad advice. It is also exactly the opposite of what I've said.
    What I have been saying is try the game on your computer before you start throwing money at it.
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited July 2010
    I misunderstood you, then. Mea culpa. :)
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited August 2010
    I am buying a new desktop, with the following specs, and was wondering if Champions will play okay/well on it?

    Here they are:
    AMD Athlon II X3 435 (2.9 GHz, 1.5 MB)
    4096 MB Dual Channel DDR3 [4x1024] Memory
    500 GB (7200 RPM) SATA Hard Drive
    Nvidia GeForce G310 512 MB graphics card

    Thanks in advance for the help!
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited August 2010
    Will the game technically run? Yes. But expect to have to turn video settings most or all the way down to make it playable. If it's a computer that you're planning on playing games on, then you really should get a decent video card, and not a low end "don't try to play games on this" version that is basically intended to function as integrated graphics for a motherboard that doesn't have integrated graphics.

    If you haven't bought the computer yet, then have a look at this thread and buy something more sensible:

    http://forums.champions-online.com/showthread.php?t=102211

    If you have a budget in mind, I can help you pick out parts.
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited August 2010
    Aside from the video card, are the other specs okay? Thanks for the help!
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited August 2010
    I'd rather see a lot more details in the specs. For example, the computer probably has a cheap junk power supply, which certainly isn't okay, but you didn't say what power supply it has. If you have gaming in mind and want to upgrade it to more powerful parts, then the case probably won't have enough airflow, and better parts might not physically fit. You didn't say what motherboard the computer has, so I can't tell if it has the modern features that you should look for, to make sure that anything you want to attach to it will work just fine.

    As for the specs you did list, 4 GB of system memory is the right amount. But you should get it in two DIMMs of 2 GB each. That means less power consumption, and also the ability to add more memory later if it becomes necessary a couple of years down the road. DDR3 is the right type of memory, but it doesn't list the frequency. You want 1333 MHz memory if you think you might ever upgrade the processor. If you're certain that you won't, then 1066 MHz memory would be fine.

    The processor is a pretty good value for $75 or so. It's a good choice for a gaming machine on a tight budget.

    The hard drive only lists the capacity and the RPM. I can guarantee you that the OEM making it will pick the cheapest (and likely therefore slowest) hard drive they can find that technically meets those specs. That's fine if you don't mind frequently waiting for things to load, but don't expect a fast, responsive system if they won't tell you what they're using.
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited August 2010
    So I know that my comp is more than capable of running the game, but for some reason I've had severe screen tearing every time I try. I've been playing WoW, City of Heroes, EVE, and StarCraft 2, all with the graphics maxed out and at least 40 or so FPS in all the other games, but for some reason I can hardly see a damn thing in this one. I attached some screenshots so you could see what the problem is. Any info or help would be greatly appreciated.
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited August 2010
    It would be better to post a DxDiag report of your system.
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited August 2010
    Argorios wrote:
    So I know that my comp is more than capable of running the game, but for some reason I've had severe screen tearing every time I try. I've been playing WoW, City of Heroes, EVE, and StarCraft 2, all with the graphics maxed out and at least 40 or so FPS in all the other games, but for some reason I can hardly see a damn thing in this one. I attached some screenshots so you could see what the problem is. Any info or help would be greatly appreciated.

    Some of those are very old games. Anyway, the pictures you link look like they could be a driver issue, not insufficient performance. Try attaching your DxDiag file to show what you've got.
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited August 2010
    OK, here is the DxDiag; that is the 32-bit one. I could run a 64-bit one too and put it up if that would make any difference.
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited August 2010
    Try turning off CrossFire and just using one GPU, and see if that makes any difference. Also try turning graphical settings way down and see if it fixes the tearing. Some people have reported issues with dynamic lighting.

    As you've got a multi-GPU card, you should be wary of overheating. How hot does your video card get at load while playing the game?
    DxDiag wrote:
    Device Name: SteelSeries World of Warcraft MMO Gaming Mouse Human Interface Device

    I'm not sure if that's the most ridiculous name for a mouse that I've ever seen, but it might be. It's not a problem, of course. It's just a ridiculous name. :D
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited August 2010
    I really have no idea how hot it gets, but the tearing isn't something that happens over time; it's there as soon as I open the game. I'll give the CrossFire thing a shot. As for the mouse, yeah, it's a weird name, but it's got so many buttons it makes playing more fun for me.



    edit: I turned off CrossFire and it didn't help at all. I actually think it made it a bit worse.
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited August 2010
    Try turning on safe mode in the launcher. That will force all settings to the minimum. If that fixes the problem, then it's probably some video setting that is causing it. You can try turning various things up and down and seeing what happens.
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited August 2010
    Safe mode didn't do anything.
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited August 2010
    You keep Windows updated, I hope? It might be a DirectX issue. I'd try reinstalling DirectX and reinstalling the game, and see if either of those makes any difference.

    My computer with Windows 7 had DirectX 11 installed when I installed Windows, but Civilization 4 didn't like it for some odd reason. Also installing DirectX 9.0c, as downloaded from Microsoft, fixed it. I'm not sure that that should have fixed it, but it did.
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited August 2010
    Eh, thanks for the help, but I give up. It looked like a great game, but it seems like I'm never gonna get to find out.
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited September 2010
    I just wanted to post a big thank you to the devs for putting this thread up and making it sticky.

    I have an old rig and it was having difficulties with the game. Specs:

    Athlon X2 2.3 GHz
    Asus M2A-VM HDMI
    4 GB DDR2-800
    Powercolor 3850 w/ 512 MB
    Philips 1440 x 900 monitor

    With graphics set to low I was getting about 25 frames per second, which dropped to about 15 here and there. I was thinking that my graphics card was the issue. My system meets all of the "recommended" criteria for running the game, but the video card is at the threshold of what is recommended and the processor is a bit better than the recommended processor (which is a 1.8 GHz dual core). Anyway, I found this thread. Based on the first post in this thread my RAM is fine (no surprise there), the video card is okay, but the processor scores 3.7 where 4 is the bottom of the usable range. Just to be sure I ran the game with Task Manager in the background, and sure enough each core is running around 90%.

    Since I don't want to spend the money to get a new computer, I decided to look at upgrade options. Using the formula in this thread I decided to get an Athlon X3 3.1 GHz processor. The bonus being that it will work with my four-year-old motherboard. (The disadvantage being that I've never upgraded a processor before and it was a bit of a pain to flash the BIOS. At first the machine refused to POST :eek:) Using the formula from the first post the new processor gets a 7, where anything over 6 is ideal.

    With just the processor change I am getting about 55 frames per second. I've boosted the graphics options and now get about 40 FPS, but the trade-off is worth it visually. Task Manager shows all three cores running at about 75% :D

    cheers,
    tallfoolvictor
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited September 2010
    Good to hear that it worked for you. A better video card wouldn't so much improve your frame rates as let you keep the same playable frame rates with higher video settings. Whether that's worthwhile is a matter of opinion. For what it's worth, your Radeon HD 3850 is roughly comparable to a more modern Radeon HD 4670 or 5550.
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited September 2010
    Quaternion wrote:
    Good to hear that it worked for you. A better video card wouldn't so much improve your frame rates as let you keep the same playable frame rates with higher video settings. Whether that's worthwhile is a matter of opinion. For what it's worth, your Radeon HD 3850 is roughly comparable to a more modern Radeon HD 4670 or 5550.

    Since you mention it, I was considering a 5770. Nvidia just released the 450, which is also a possibility. I'm holding off for a while to see if AMD releases the 6000 series soon.
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited September 2010
    Since you mention it, I was considering a 5770. Nvidia just released the 450, which is also a possibility. I'm holding off for a while to see if AMD releases the 6000 series soon.

    The Radeon HD 6770 will launch in about a month, and the Radeon HD 6870 about a month after that. The Radeon HD 6000 series ("Northern Islands") is a new architecture, and likely the biggest architecture change since the Radeon HD 2000 series. It will be made on the same TSMC 40 nm bulk silicon process as the Radeon HD 5000 series and GeForce 400 series cards.

    There aren't credible rumors on pricing yet, but I'd expect the 6770 to sell for somewhere in the $150-$200 price range initially, with performance to match--so it should handily beat a Radeon HD 5770 or GeForce GTS 450.

    For what it's worth, when Nvidia launched the GeForce GTS 450 for $130, AMD said they'd cut Radeon HD 5770 prices to $130 to match. The 5770 is both faster and cheaper to produce than the GTS 450, so it's really just a question of whether AMD wants to sit and make a fat profit on each, or claim more market share. Nvidia has been pricing their GTX 460 and GTS 450 cards aggressively, though, which gives fans decent value for the money if they want to go Nvidia.

    Nvidia won't have anything new for desktops until the second half of 2011 at the earliest, apart from low end GF108-based cards that may not be any faster than what you already have and possibly minor tweaks to existing cards (e.g., a GF104-based card with the full GPU chip active).
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited September 2010
    Fantastic information in this thread, and a great way to stratify the pieces of hardware in a straightforward manner, given some of the crazy naming schemes out there.

    If I might pick your brain for a moment, I have a question. My secondary rig (previously my primary) is as follows:
    C2D E6600 (2.4GHz)
    EVGA 680i
    WD Caviar Blue 320 GB
    2GB DDR2 RAM (I can't remember off the top of my head what brand; I'm at work)
    Corsair 650W PSU
    Turtle Beach SantaCruz (Don't look at me that way! I have my reasons)
    Sapphire 5770 1GB (Not a Juniper XT. Recently replaced EVGA 8800GTS 640MB launch model)
    Windows XP Home SP3


    Given the above, would you expect there to be any tangible improvement in going from an E6600 to an E8400? The Wolfdales are built on a 45nm process, as opposed to the 65nm of the E6600. It's about the strongest affordable CPU the 680i can apparently take. I don't think it would provide a big enough boost to warrant it, but perhaps I'm underestimating what it'd do and whether or not the heat/power consumption difference would be noteworthy.

    Then again...I don't really want to pull apart my heatsink contraption. God, what was I thinking when I bought that thing?
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited September 2010
    I think you'd see a substantial performance improvement in Champions Online. But Intel charges crazy prices for their old Core 2 Duo processors, making it not a good value upgrade, unless you can find a Core 2 Duo E8400 for a whole lot cheaper than the $170 that Newegg charges. You say it's your secondary machine; if you don't use it much, it may not be worth putting any money into it at all.

    The relevant thing on memory is not the brand name but the clock speed. Even if the brand name mattered, I'd think it would be the company that built the memory chips that would matter more than the company that assembles modules and markets them, and you generally don't know what you're getting there. Well, I guess you'd know that Crucial memory uses Micron chips, since they're the same company, but otherwise, you wouldn't know if you're getting Hynix, Elpida, Samsung, or whatever. Not that there's any real reason to care.

    What does matter, though, is the clock speed of the memory. 400 MHz DDR2 could be enough of a bottleneck that you might not see the performance boost that you ought to. 800 MHz or higher would be fast enough to not be an issue at all. You can check your memory speed in the BIOS, or you can download and run CPU-Z and check "DRAM Frequency" in the memory tab. Note that CPU-Z doesn't double the clock frequency for DDR the way that memory manufacturers do.

    If you are looking to upgrade a processor, there are better options, though. For starters, there's this:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16819116371

    That doesn't have as much cache as a Core 2 Duo, but it does offset it with a higher clock speed, likely giving around the same net performance.

    Conroe and Wolfdale use about the same power. Wolfdale tends to be clocked higher, but that's offset by the die shrink. They have the same official TDP of 65 W, but that's a very generous 65 W, and if Intel were to call it 50 W, no one would complain. (Except maybe AMD. Maybe.) Your power supply would be massively overkill for either with that video card, assuming it's something reasonably modern (TX or HX) and not some ancient power supply that I'm unaware of.

    Another alternative that would actually give you better performance is an Athlon II X3 from AMD:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16819103872

    That would take a new motherboard, of course. But you can get an old DDR2 motherboard for cheap:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16813128342

    That plus the processor added together is $40 cheaper than a Core 2 Duo E8400 alone.

    If you were to get a slightly more modern motherboard like this:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16813130290

    it would be more expensive, and you'd also have to replace the memory with DDR3. 4 GB of DDR3 can be had for as little as $70 now:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16820146748

    Still, that plus a new processor and motherboard is about $220 for what you describe as a secondary machine, and probably more than you'd want to sink into it. You might run into trouble with a Windows license if you replace too much, too.
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited September 2010
    Thank you for the very in-depth reply. I did dig up my old order and find the RAM inside that rig. My mind is blown at how much I paid for it back when it was new. @_@

    If by DDR2 400 MHz you're referring to PC2 3200, then I'm in the clear. If you mean PC2 6400, then that is indeed what it is. While the board will apparently support up to DDR2 1200 (PC2 9600), I've heard stories about the "extreme" clock speeds bringing out critical heat issues on the EVGA 680i, hence my pick of PC2 6400 at the time.

    http://www.newegg.com/Product/Product.aspx?Item=N82E16820145590


    I have noticed that the old Core 2 Duos remain high in price. It seems pretty crazy, when a Phenom II X4 965 costs a few dollars less. The same thing happened when I went to replace my P4 Northwood 2.4 GHz with a 2.8 GHz Northwood (the most that particular board would handle). The prices were baffling for a CPU that was more than a little antiquated at the time. How do they get away with that?

    Unfortunately a platform shift definitely isn't in the budget, as it would probably red flag my Windows XP as well. All said and done, it'd be more than I ought to spend on a rig its age. I do, however, much appreciate the E6800 link. I hadn't even considered a higher-clocked E6X00 CPU, and that costs immensely less than a Wolfdale.
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited September 2010
    A Pentium E6800 is a Wolfdale. It might well be a physically different die, which would account for the difference between 2 MB of cache and 6 MB. But it's still a 45 nm chip. Having the extra L2 cache does make the die more expensive to produce, as cache takes a lot of space. An Athlon II is just a Phenom II without the L3 cache, for example. (In some cases, it's even a Phenom II die with the L3 cache disabled because it didn't work.)

    But really, Intel probably just charges that much because they can. If you've already got the motherboard, few people are going to be willing to replace the motherboard in addition to the processor to switch to AMD.
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited September 2010
    Quaternion wrote:
    A Pentium E6800 is a Wolfdale. It might well be a physically different die, which would account for the difference between 2 MB of cache and 6 MB. But it's still a 45 nm chip. Having the extra L2 cache does make the die more expensive to produce, as cache takes a lot of space. An Athlon II is just a Phenom II without the L3 cache, for example. (In some cases, it's even a Phenom II die with the L3 cache disabled because it didn't work.)

    But really, Intel probably just charges that much because they can. If you've already got the motherboard, few people are going to be willing to replace the motherboard in addition to the processor to switch to AMD.

    Great information. I didn't read the details as closely as I should have, so I missed that it was also a Wolfdale. :]

    So, was it PC2 3200 or 6400 that you were referring to as being a potential bottleneck? Just to get that off my mind, one way or the other. The way DDR is denoted can be a little confusing sometimes when talking about the actual speed as compared to the "doubled" speed that's printed on it. :o
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited September 2010
    PC2 6400 is 800 MHz DDR2, which will have plenty of speed for any dual core processor. Well, maybe Intel's desktop Core i5s with hyperthreading could use more memory bandwidth than that, but the chipsets for those require DDR3 memory, so they also get all the bandwidth they need.

    Or you can check your memory clock speed directly. Download CPU-Z and run it, click the memory tab, and look at DRAM Frequency. If it says 400 MHz, then you're set. 400 MHz is the real clock speed of 800 MHz DDR2, and memory manufacturers multiply it by 2 because, well, really because their competitors do and they don't want people to think their memory is a lot slower. It's kind of like how hard drive manufacturers pretend that 10^3 = 2^10, and once one started and got away with it, they all had to do that to be competitive.

    AMD is trying to clamp down on this silly multiply by 2 business, by listing the real clock speed of GDDR5 in its video cards. GDDR5 is quad data rate, so for the "effective" data rate, you'd multiply by 4. Then Nvidia couldn't make a GDDR5 memory controller that worked right and had to clock it a lot slower than AMD, so Nvidia decided to multiply by 2 to make their memory speeds look somewhat faster than AMD's, rather than a lot slower. Multiplying by 2 doesn't make a bit of sense at all for GDDR5, but maybe Nvidia figured that if they multiplied by 4, people would catch on that the numbers weren't comparable.
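
    All of this "effective rate" business boils down to a fixed multiplier per memory type, which is easy to see as a calculation (the multiplier table is the only assumption here: two transfers per clock for the DDR families, four for GDDR5):

```python
# Data transfers per clock cycle for each memory type.
TRANSFERS_PER_CLOCK = {"DDR2": 2, "DDR3": 2, "GDDR5": 4}

def effective_rate_mhz(real_clock_mhz, mem_type):
    """Marketing 'effective' rate from the real clock frequency."""
    return real_clock_mhz * TRANSFERS_PER_CLOCK[mem_type]

# 400 MHz real-clock DDR2 is sold as "800 MHz" (PC2 6400), which is
# what a 400 MHz DRAM Frequency reading in CPU-Z corresponds to.
print(effective_rate_mhz(400, "DDR2"))   # 800
# GDDR5 moves data four times per cycle, so a 1000 MHz real clock
# is a 4000 MT/s effective rate.
print(effective_rate_mhz(1000, "GDDR5"))  # 4000
```

So when the marketing number is double the CPU-Z number, nothing is wrong; they're describing the same memory.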
  • Archived PostArchived Post Posts: 1,156,071 Arc User
    edited September 2010
    Quaternion wrote:
    PC2 6400 is 800 MHz DDR2, which will have plenty of speed for any dual core processor. Well, maybe Intel's desktop Core i5s with hyperthreading could use more memory bandwidth than that, but the chipsets for those require DDR3 memory, so they also get all the bandwidth they need.

    Or you can check your memory clock speed directly. Download CPU-Z and run it, click the memory tab, and look at DRAM Frequency. If it says 400 MHz, then you're set. 400 MHz is the real clock speed of 800 MHz DDR2, and memory manufacturers multiply it by 2 because, well, really because their competitors do and they don't want people to think their memory is a lot slower. It's kind of like how hard drive manufacturers pretend that 10^3 = 2^10, and once one started and got away with it, they all had to do that to be competitive.

    AMD is trying to clamp down on this silly multiply-by-2 business by listing the real clock speed of the GDDR5 in its video cards. GDDR5 is quad data rate, so for the "effective" data rate, you'd multiply by 4. But Nvidia couldn't make a GDDR5 memory controller that worked right and had to clock it a lot slower than AMD, so Nvidia decided to multiply by 2 to make their memory speeds look somewhat faster than AMD's, rather than a lot slower. Multiplying by 2 doesn't make any sense for GDDR5, but maybe Nvidia figured that if they multiplied by 4, people would catch on that the numbers weren't comparable.

    Yes, the Hard Drive size thing has irked me for years. Whenever my friends get a new hard drive and don't understand why it's smaller than what the box says, I have to go over that. X)
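For anyone going over this with a friend, the hard drive discrepancy is just the decimal-vs-binary units mentioned above (the 500 GB figure is only an example):

```python
# Drive makers count decimal gigabytes (10^9 bytes), while the OS
# reports binary gibibytes (2^30 bytes), so the "same" drive looks smaller.

def advertised_to_reported_gib(advertised_gb):
    return advertised_gb * 10**9 / 2**30

print(round(advertised_to_reported_gib(500), 1))  # ~465.7
```

So a "500 GB" drive showing up as roughly 465 GB in Windows is working exactly as labeled; it's the units, not missing space.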

    That said, it is indeed PC2-6400 (400 MHz real, 800 MHz when doubled), as confirmed by the NewEgg listing, my BIOS, and CPU-Z. I just got a little bit lost in the terminology, with regards to whether you were referring to 400*2 = 800 or 200*2 = "400". I'm glad to hear AMD is trying to clamp down on that. It's a common source of confusion that often makes one do a double-take to get things straight.

    So it sounds like the old dog should be fine with regards to memory, and that's a relief. One less thing to consider. I'm going to keep the E6800 bookmarked in the event that I ever decide to pour a little more money into that thing. I just dread removing the Zalman heatsink (what was I thinking!?).

    The new one is certainly swimming in over-the-topness. I'd have made an i7 860/870 system, but I found a crazy deal on an i7 930 and a 1366 motherboard that brought the collective prices of the two so close that I just ended up with the i7 930.