
Thermal and overheating issues!

Comments

  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    Spots wrote:

    However, swapping my new ATI 4890 out for two older, much bigger nVidia 8800 GTXs solved my overheating problems (with bigger, more heat-producing, more air-blocking hardware?) and it runs faster. Like it or not, I think this indicates STO has some ATI-unfriendly code in there, in some manner.

    It's all down to how the original engine was optimized. Very rarely will a game run the same on both ATI and NVIDIA. But with some patches to the source code this can be sorted at a later date.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    So we've figured out that there are 2 problems.

    1. The overheating issue is the fault of the user.

    2. The overuse of hardware is STO's fault.

    sarcasm There is no way to fix any of this! /sarcasm ..
    :cool:
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    So we've figured out that there are 2 problems.

    1. The overheating issue is the fault of the user.

    2. The overuse of hardware is STO's fault.

    sarcasm There is no way to fix any of this! /sarcasm ..
    :cool:

    While STO does, under certain circumstances, over-utilize hardware on some computers (with a common thread that has yet to be identified), let's not overstate the issue. STO will still run on a fairly broad array of hardware. It runs just fine on machines with single-core CPUs from the Netburst/K8 generation, and will function adequately on almost any SM3.0 capable GPU in existence, including the latest generation of integrated GPUs. That's very impressive for this game. People need to remember that this isn't World of Warcraft here; this game features a lot more eye candy, from HDR lighting to ambient occlusion to relatively high-polygon objects, and even with these turned down or off (minus the last one), the game is still going to be relatively demanding.

    There is an issue with over-utilization, but as you point out, on a properly-cooled system, this should be no issue, especially as it doesn't keep the game from being run on an impressive array of machines (though God help you if you have the over-utilization glitch on a Netburst-based CPU :eek::D:eek:).
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    Scay wrote:
    I didn't say 100 FPS will damage your card, I said 100 FPS is an absolutely useless framerate that should never be produced. STO will produce framerates in the thousands, though, which may damage a weaker system.

    Scay, Catamount stated why your statements are wrong. It's not a matter of opinion, it's a matter of fact. Processors and chips are not damaged by processing full loads, even for extended periods of time. If what you state were true, then every processor would need built-in throttling to keep it from ever reaching a full load. :rolleyes: And that's just silly.

    Games can't damage your hardware, but defective hardware can corrupt files. It's a one-way street.
    Catamount wrote: »
    Scay's logic seems to work something like this: STO is made to run on the PC platform, and grandma's Celeron-based Emachines PC with a Geforce 6150 is PC, therefore STO must be made to run on grandma's computer. It's a classic undistributed middle fallacy.

    Indeed.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    Spots wrote:
    Turning Vsync on is more like doing /maxFPS 60 (if your monitor has a 60 Hz refresh rate). A dev posted elsewhere that /perframesleep isn't the same as /maxFPS (or vsync).

    My guess is the per-frame sleep adds wait cycles to each frame, letting it cool off. Vsync and maxFPS try to force the game into a more regular refresh rate, which may add a lot of waits if it's not busy, or none if it's in a busy area.

    My guess would be that /perframesleep would help cool the card in more general circumstances, but might slow the game more. But I'm not a dev, so can't be sure.

    However, swapping my new ATI 4890 out for two older, much bigger nVidia 8800 GTXs solved my overheating problems (with bigger, more heat-producing, more air-blocking hardware?) and it runs faster. Like it or not, I think this indicates STO has some ATI-unfriendly code in there, in some manner.

    Thank you. I suspected they were different. I tried vsync on and the /perframeratesleep together and it does run a bit cooler. I'm running between 67 and 73C now, solid. No lockups or weirdness on my GTX285. I shifted some things around and increased my cooling as well. When I started I was running from 79-85 and getting the odd lockup now and then.

    Personally, I like the fact that this particular software drew attention to a flaw in my system where it really did need a little extra cooling. Nvidia really needs to do a better job of auto-adjusting the fan speed as the temp goes up. If it was not STO, it would have been another program that pointed this out. I always try to balance cooling with noise, and I had sacrificed a little too much cooling for less noise. Looking at my fan specs, I think I can get the same CFM for half the noise if I pick up a few nice Silenex case fans.

    Your average user probably never messes with their case cooling, or cleans it out for that matter. I would not have configured this software the way they have with that in mind. I'd remove the vsync button, leave it on, and be able to override it with a software switch for the folks who like to tinker and tweak.
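
    To make the distinction concrete, here is a minimal sketch in Python of how a framerate cap like /maxFPS differs from a fixed per-frame sleep like /perframesleep, as described above. It is purely illustrative (STO's engine is obviously not written this way, and the frame times and function names are made up): the cap sleeps only for whatever is left of the frame budget, while the per-frame sleep always inserts a fixed delay.

    import time

    TARGET_FPS = 60
    FRAME_BUDGET = 1.0 / TARGET_FPS    # seconds per frame at the cap

    def render_frame():
        # Stand-in for the real work of simulating and drawing one frame.
        time.sleep(0.004)              # pretend a frame takes ~4 ms on a fast GPU

    def run_with_max_fps(frames=300):
        """Framerate cap: sleep only for whatever time is left in the frame budget."""
        for _ in range(frames):
            start = time.perf_counter()
            render_frame()
            elapsed = time.perf_counter() - start
            if elapsed < FRAME_BUDGET:
                time.sleep(FRAME_BUDGET - elapsed)   # idle time lets the GPU/CPU cool off

    def run_with_per_frame_sleep(frames=300, sleep_ms=10):
        """Fixed per-frame sleep: always wait a set amount, no matter how busy the frame was."""
        for _ in range(frames):
            render_frame()
            time.sleep(sleep_ms / 1000.0)            # slows light scenes and heavy scenes alike

    Either way, the card spends part of every frame idle instead of redrawing a menu screen at hundreds or thousands of FPS, which is why both settings bring temperatures down.
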
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    CapnScragg wrote: »
    Thank you. I suspected they were different. I tried vsync on and the /perframeratesleep together and it does run a bit cooler. I'm running between 67 and 73C now, solid. No lockups or weirdness on my GTX285. I shifted some things around and increased my cooling as well. When I started I was running from 79-85 and getting the odd lockup now and then.

    Personally, I like the fact that this particular software drew attention to a flaw in my system where it really did need a little extra cooling. Nvidia really needs to do a better job of auto-adjusting the fan speed as the temp goes up. If it was not STO, it would have been another program that pointed this out. I always try to balance cooling with noise, and I had sacrificed a little too much cooling for less noise. Looking at my fan specs, I think I can get the same CFM for half the noise if I pick up a few nice Silenex case fans.

    Your average user probably never messes with their case cooling, or cleans it out for that matter. I would not have configured this software the way they have with that in mind. I'd remove the vsync button, leave it on, and be able to override it with a software switch for the folks who like to tinker and tweak.

    A few Geforce cards really do tend to run rather hot, though, to be honest, 85C shouldn't be too bad for a GTX285. That said, if you got that down to the low 70s, then that's very good, because that's that much less heat that builds up inside your case. You're definitely right about fan speeds though, at least from a few vendors, as they seem to go too far to keep things quiet instead of cool. My Radeon HD 5770s never get out of the 60s except in Furmark, which gets them up to the 80s, and that's fine, because I wouldn't worry about these GPUs until they hit 100 or so, but I am somewhat dismayed that even in the mid 80s the fans don't spin up to 100% (they hit about 90% at 86C, which is the present maximum I've gotten on these cards). I don't really want my GPUs waiting until they hit 90-95C before they go full blast.
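
    For anyone curious what such a fan curve looks like, here is a rough Python sketch of the temperature-to-duty mapping a vendor BIOS or fan-control tool applies. The curve points are made-up placeholders chosen only to mirror the behavior described above (roughly 90% duty at 86C, 100% only in the 90s); they are not any vendor's actual defaults.

    # Hypothetical fan curve: (temperature in C, fan duty in %). Placeholder values only.
    FAN_CURVE = [(40, 30), (60, 40), (75, 60), (86, 90), (95, 100)]

    def fan_duty(temp_c):
        """Linearly interpolate the fan duty between the defined curve points."""
        if temp_c <= FAN_CURVE[0][0]:
            return FAN_CURVE[0][1]
        if temp_c >= FAN_CURVE[-1][0]:
            return FAN_CURVE[-1][1]
        for (t0, d0), (t1, d1) in zip(FAN_CURVE, FAN_CURVE[1:]):
            if t0 <= temp_c <= t1:
                return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

    # With a curve shaped like this, the fan never goes full blast until the GPU is already in the 90s.
    for t in (70, 86, 92):
        print(t, "C ->", round(fan_duty(t)), "% duty")
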
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    Catamount wrote: »
    A few Geforce cards really do tend to run rather hot, though, to be honest, 85C shouldn't be too bad for a GTX285. That said, if you got that down to the low 70s, then that's very good, because that's that much less heat that builds up inside your case. You're definitely right about fan speeds though, at least from a few vendors, as they seem to go too far to keep things quiet instead of cool. My Radeon HD 5770s never get out of the 60s except in Furmark, which gets them up to the 80s, and that's fine, because I wouldn't worry about these GPUs until they hit 100 or so, but I am somewhat dismayed that even in the mid 80s the fans don't spin up to 100% (they hit about 90% at 86C, which is the present maximum I've gotten on these cards). I don't really want my GPUs waiting until they hit 90-95C before they go full blast.

    85C is within the operating temp of a GTX 285, but... I was getting odd lockups that I suspected were related to the temp. Honestly, though, they patched the game so many times the last week that I will never know for sure.

    That's great your ATI is running that cool. I know the 4850 cards were running quite a bit hotter. I think the 4850 I had idled around 70C. I'm a self-professed Nvidia fanboy, but now and then if I find a deal on an ATI card I'll get one and play with it a few months.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    I run my game on an ATI 4870 X2, and my card is almost always trying to fry itself when running STO.
    I used to run everything on max, but the game crashed a lot of times, so I set the graphics back to the recommended settings. It is running fine now.

    (There is no way my case is dirty, because (a) it's new and (b) it's a Silverstone Raven RV01, so the motherboard and all cards are positioned differently from normal ATX solutions.)

    I will try and see if I can run the game at full power when the game goes live, though.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    I run my game on an ATI 4870 X2, and my card is almost always trying to fry itself when running STO.
    I used to run everything on max, but the game crashed a lot of times, so I set the graphics back to the recommended settings. It is running fine now.

    (There is no way my case is dirty, because (a) it's new and (b) it's a Silverstone Raven RV01, so the motherboard and all cards are positioned differently from normal ATX solutions.)

    I will try and see if I can run the game at full power when the game goes live, though.

    The 4870 X2 is also one of the hottest and most power hungry video cards with dual GPUs.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    The 4870 X2 is also one of the hottest and most power hungry video cards with dual GPUs.

    I could never quite wrap my mind around SLI/Xfire or x2 solutions (from either ATI or Nvidia).. I could never quite justify the total cost in terms of extra power, cooling, noise, and frustration from shoehorning applications made mostly for single GPU systems into the solution vs the amount of perceivable performance you gain. But hey, whatever floats your boat I guess. :)
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    CapnScragg wrote: »
    I could never quite wrap my mind around SLI/Xfire or x2 solutions (from either ATI or Nvidia).. I could never quite justify the total cost in terms of extra power, cooling, noise, and frustration from shoehorning applications made mostly for single GPU systems into the solution vs the amount of perceivable performance you gain. But hey, whatever floats your boat I guess. :)

    Well, I have thoroughly enjoyed my NVIDIA GTX 295, which is also a dual GPU card. I hate SLI and Crossfire (been there, done that) considering the costs of upgrading two cards and the warranty replacement of one that might break SLI/Crossfire if you're given a newer model. NVIDIA's GTX 295 and ATI's 5970 dual-GPU cards are among the first to actually do dual-GPU right. Both of these cards consume only a fraction more power and produce only a fraction more heat than single GPU cards of the same class. In the past, dual GPU cards typically meant dual PCB cards with one connector for the motherboard. Those always produced an excess amount of heat. These dual GPU cards have internal SLI/Crossfire bridges.

    The 4870 X2 was also the first to be dual-GPU on one PCB. But it wasn't downclocked or otherwise tweaked relative to its single-GPU cousin, and thus produced much more heat. The GTX 295 and 5970 were tweaked to work more efficiently as dual-GPU cards. Don't get me wrong, though: with adequate case cooling and such, the 4870 X2 is a very powerful card.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    Despite my tendency to shy away from SLI and Crossfire only a short while ago, implementation and performance scaling have improved so much with recent technological development that those options are now far more appealing, and many of the horrid memories of past implementations (my own included) simply no longer reflect the present situation. I, for one, am extremely happy with my two Radeon HD 5770s in Crossfire X.

    Most of the early headaches of discrete multi-GPU setups are now gone. SLI no longer requires dedicated "master" and "slave" cards to be purchased, just as Crossfire no longer requires the unwieldy specialized cables that ran to the power supply in early implementations (though many cards still require a dedicated line for power). Power consumption on GPUs in general has also been decreasing due to more efficient fabrication and design, making the prospect of multiple cards a far less monstrous one than it was; the inordinately explosive growth of GPU power consumption in the face of the ever-increasing efficiency of other components represented a design disparity that we no longer have to tolerate, a fact that can be solely credited to ATI and AMD thanks to their advancement beyond the traditional monolithic-GPU-of-doom paradigm with the Radeon HD 4000 series.

    Reducing the headache even further is the fact that only SLI is broken by slight differences between cards. CrossfireX is so tolerant that you can match cards of completely different models as long as they're of the same series (and you're not crossing cards on opposite ends of the spectrum), the caveat being that the faster card gets down-clocked to match the slower. It's a pointless thing to do most of the time, but it goes to show how tolerant Crossfire's second-generation implementation is of differences, and it's quite handy if you have, say, a 512 MB card and want to do two GPUs with a 1 GB card, because you can then use the frame buffer of the larger card (SLI/Crossfire do not use memory from both cards, though single-PCB solutions do, at the expense of making other concessions).

    There's also a performance/price aspect to it. Generally speaking, it's not worthwhile to implement multi-GPU setups with top-end cards, because most games don't require that kind of horsepower, and you're spending money that could instead just be saved and invested in a newer, faster card a year or two down the road. That said, the performance scaling of SLI and Crossfire has improved enormously with newer generations of chipsets, as mediocre 30-50% gains have turned into near-linear performance increases with dual-GPU setups (and tri-GPU setups are scaling better every year). This creates a situation where it is now feasible and advantageous to use multi-GPU schemes with mid-range GPUs. For a mere $260, one can put two Radeon HD 5750s in Crossfire and have a setup that handily outperforms a single 5850 in most instances, and for a mere ~$310 in 5770s, one can have a setup that easily outperforms a single Radeon HD 5870 in most instances.

    This review provides a reasonably good outline of the present situation. Note that by this point in time, most DirectX 9 titles are no longer appropriate benchmarks for the highest-end GPU setups, as the ties in performance between the CrossfireX 5770, 5850, and 5870 setups show that these games are now CPU-limited, even on the reviewer's massively powerful Core i7 965. Nevertheless, it is clearly visible in most games (particularly those that are GPU-demanding enough for it to matter) that Crossfire setups offer huge performance advantages, and a very interesting prospect with the midrange cards.

    Of further note is the aforementioned tri-CrossfireX setup with the 5770. It is clearly not well supported by present profiling, as most games saw no benefit from the added card (and some actually showed a decrease in FPS), but its performance in the small handful of titles that do support it is very interesting: three 5770s (which can be had for a mere ~$470) roughly match the performance of two Radeon HD 5870s (which cost roughly $800-$850). Quad-GPU setups are still a long way from seeing any kind of significant support, barring dual-PCB quad-GPU setups such as dual GTX 295s and 5970s (which show very good scaling), mostly because DirectX 10 actually ignores the 4th GPU (which DirectX 11 hopefully does not). Still, just as dual SLI and Crossfire setups are starting to become appealing with recent development, tri-card setups may very well become a very appealing option down the road as further development creates better and more ubiquitous profiling for games (almost all games that would actually see a benefit have profiles for dual cards these days, barring a title or two such as Supreme Commander).


    Of final note is the benefit of sheer redundancy offered by dual-card setups. Having one GPU fail is no longer a crippling event, as one has another to work with while filing and going through an RMA. In my case, the situation proved even simpler. One of my 5770s has faulty memory, but as only one card's frame buffer is used in CrossfireX, I needed but to switch the cards around and see the problem instantly resolved.

    SLI and Crossfire were, indeed, once a headache that provided little benefit outside of bragging rights. The more time goes on, however, the more there is a place for such setups in higher-end gaming machines, as the option adds a level of flexibility and choice not previously present in GPU purchasing, thanks to improved implementation.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    Catamount wrote: »
    Of final note is the benefit of sheer redundancy offered by dual-card setups. Having one GPU fail is no longer a crippling event, as one has another to work with while filing and going through an RMA.

    I can certainly attest to that as well. I used two 8800 GTX cards at one point. One died after 1.5 years, and during the RMA I could still run on one to get me by.

    But I wouldn't use that as an argument to use SLI. For my GTX 295, I have EVGA's extended RMA plan, where they'll send me out a replacement GTX 295 if anything happens to mine. That way, less down time for me. But still, for single card people, just keep your old card if it still works (don't sell it!). That way you have a backup when your card dies while you are handling the RMA process.

    My biggest gripe about SLI for this RMA situation is that companies may no longer have your card in stock if it's an older model. So if you need a warranty replacement, they may give you a different, newer model. This might be fine for CrossfireX, but it completely cripples SLI. And companies won't take back both cards to replace both with equal models unless you want to lie and claim that the second is defective too. I prefer not to do that, so I avoid SLI.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    You do have a point about keeping your old card around. Mine generally go to hand-me-down systems that friends own (my last full system went to a friend's little brother so he could play WoW smoothly with some eye candy). That said, if something catastrophic should happen, I have a PCIE Geforce 7300LE that I got for like $20 sitting in a box just so I can run my machine. It wouldn't run much in the way of games, but it would keep the computer internet- and work-capable at least, so that I wouldn't be without a machine altogether.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    Catamount wrote: »
    You do have a point about keeping your old card around. Mine generally go to hand-me-down systems that friends own (my last full system went to a friend's little brother so he could play WoW smoothly with some eye candy). That said, if something catastrophic should happen, I have a PCIE Geforce 7300LE that I got for like $20 sitting in a box just so I can run my machine. It wouldn't run much in the way of games, but it would keep the computer internet- and work-capable at least, so that I wouldn't be without a machine altogether.

    Hahaha, same here in that my old hardware goes to other PCs. If I really need a backup to my GTX 295 (if it dies in the future), I can pull one or two 8800 GTX cards from my second PC to get back up and running. And my third PC has two 7600 GT cards in SLI. I even have a 7200, 7300, and 8500 el-cheapo card around for testing (I fix other people's PCs a lot). I also use a 9600 GT card as a second video card in my PC to run a 4th monitor and a Wacom Cintiq tablet, so I could just switch over to that card for primary gaming temporarily.

    Lots of options when you're a geek/PC-modder. ;) For other non-geek folk, I just recommend keeping their previous card provided they didn't wait 5 years between upgrades.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    Haha, yeah, I used to have a spare of every part imaginable, but then I moved from NH to NC and had a "garage sale" of sorts, where I just gave stuff away more or less for free to clear most of it out so I wouldn't have to take it. That's actually where I scrounged up enough parts for a full PC for the aforementioned little brother of my friend. The only things I didn't have were a case and a couple of other parts (HDD, etc), so I MacGyvered it all into an old Compaq case (using electrical tape to connect the cut-and-separated wires for the power switch and HDD light that had been part of a proprietary connection :D). I also had to build a heat sink mount from whatever I could find, because the mount on the motherboard was a custom one for a heatsink that had broken, so I just screwed some random metal bracket down onto the heat sink's lower piece (the part that makes contact with the CPU) using the available TRIBBLE holes to hold it in place, and a year and a half later nothing has overheated :D

    The whole thing was ridiculously improvised. Still, an Athlon 64 3700+, 2GB of DDR 400, and a Geforce 8800GTS 320 MB isn't bad for the ~$100 they had to spend filling in the part gap.

    That's pretty much the story of where all my spare parts went...
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    I do still have a bunch of CPUs floating around for some reason, including one of these bad boys :D
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    Catamount wrote: »
    I do still have a bunch of CPUs floating around for some reason, including one of these badboys :D

    LOL! Nice. :) Those old ceramic chips are tough to find these days, but many are still working.

    I had so much hardware cr*p laying around that I had to buy a ton of ugly plastic storage drawers/bins to sort and keep it all. Still giving away 386 through Pentium III hardware.

    Ever see my retro PC build?
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    LOL! Nice. :) Those old ceramic chips are tough to find these days, but many are still working.

    I had so much hardware cr*p laying around that I had to buy a ton of ugly plastic storage drawers/bins to sort and keep it all. Still giving away 386 through Pentium III hardware.

    Ever see my retro PC build?

    Did you use enough thermal compound on that Pentium CPU?! :eek: :p

    That's so awesome though...
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    Catamount wrote: »
    Did you use enough thermal compound on that Pentium CPU?! :eek: :p

    That's so awesome though...

    Thanks. And LOL, well ceramic is different than aluminum heat spreaders. As you know, you should always use a very, very thin coating for newer CPUs with aluminum heat spreaders for better performance. But for ceramic, it's very porous and will absorb most of that thermal compound paste over time. Also, ceramic is pretty close to thermal paste for heat conductivity -- unlike aluminum -- so having excess isn't as big of a deal as it is for modern CPUs.
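
    As a back-of-the-envelope check on that, you can compare the one-dimensional conduction resistance R = t / (k * A) of a paste layer against the package material under it. The Python sketch below uses rough textbook ballpark conductivities (paste ~5 W/m*K, alumina ceramic ~25 W/m*K, aluminum ~200 W/m*K) and made-up layer thicknesses, not measured values for any specific part. On these numbers, a sloppy paste layer is comparable to the ceramic wall itself but dwarfs the resistance of a thin aluminum spreader, which is roughly the point being made above.

    # Rough one-dimensional conduction: R = thickness / (conductivity * area).
    # Conductivities and thicknesses are ballpark assumptions, not specs for any real part.
    def resistance(thickness_m, k_w_per_mk, area_m2):
        return thickness_m / (k_w_per_mk * area_m2)

    AREA = 0.03 * 0.03                                    # ~3 cm x 3 cm contact patch
    K_PASTE, K_CERAMIC, K_ALUMINUM = 5.0, 25.0, 200.0     # W/(m*K), approximate

    thin_paste   = resistance(0.05e-3, K_PASTE, AREA)     # careful ~0.05 mm layer
    thick_paste  = resistance(0.30e-3, K_PASTE, AREA)     # sloppy ~0.3 mm layer
    ceramic_lid  = resistance(1.0e-3, K_CERAMIC, AREA)    # ~1 mm ceramic package wall
    alu_spreader = resistance(1.0e-3, K_ALUMINUM, AREA)   # ~1 mm aluminum heat spreader

    for name, r in [("thin paste", thin_paste), ("thick paste", thick_paste),
                    ("ceramic lid", ceramic_lid), ("alu spreader", alu_spreader)]:
        print(f"{name:12s}: {r:.4f} K/W")
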
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    I see, that's very interesting. I suppose given the operating temperatures of the CPUs of the day, it mattered a little less to have the type of thermal conductivity that's required for today's chips, hence the passive heat sink.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    Speaking of thermal paste, perhaps you could answer a question. While it's generally no issue, I was concerned about the heat my video cards produce in Furmark (~87C), because unlike most places I've lived, NC tends to have very hot summers, and this room can get into the low 80s F at times.

    Do you think I might be able to improve thermal performance by replacing the thermal compound/pad on my GPUs with some AS5?
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    Catamount wrote: »
    I see, that's very interesting. I suppose given the operating temperatures of the CPUs of the day, it mattered a little less to have the type of thermal conductivity that's required for today's chips, hence the passive heat sink.

    Yup. Remember the old 286 and 386 chips, of which most did not have any heatsink at all? :) 486 CPUs brought an end to that. I remember seeing the first fan/heatsink combos with 486 and Pentium chips that were overclocked back in the days. Glad we're past the days when overclocking meant changing jumpers or soldering a wire or two or penciling-in some traces.

    Talking about overheating, those overclocks back then could literally blow the ceramic right off the CPU.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    Yup. Remember the old 286 and 386 chips, of which most did not have any heatsink at all? :) 486 CPUs brought an end to that. I remember seeing the first fan/heatsink combos with 486 and Pentium chips that were overclocked back in the days. Glad we're past the days when overclocking meant changing jumpers or soldering a wire or two or penciling-in some traces.

    Talking about overheating, those overclocks back then could literally blow the ceramic right off the CPU.

    Haha, yeah, it was pretty bad, especially with some of the 3rd-party chips (Cyrix and AMD knockoffs tended to run obscenely hot no matter what :))

    Of course, a lot of that is before my time, as I didn't come in until the very end of the 90s. Most of my experience is in later play with a lot of this older stuff. My high school's help desk was a utopia of old parts. In fact, the fastest things they were able to scrounge up as workstations for me when I volunteered were a PIII 500 for Fedora and a 750 MHz version for Windows XP. The tech guy who headed it once jokingly asked if I'd like to see the "biggest" hard drive he had, and when I said yes, he pulled out a giant unit that went into a 5.25" bay :D
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    Catamount wrote: »
    (...) he pulled out a giant unit that went into a 5.25" bay :D

    Hahaha! Those were beasts. They advertised hard drives in those days by how many floppies worth of data they could hold. ;)

    The first hard drive I owned was a 30 MB model.

    But here's an older 10 MB model! In those days, hard drives made so much noise and vibration that they had to suspend them with large rubber mounts.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    Hehe, yep, that looks about right. The one I saw was 20MB.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    Great minds.... I keep enough spare parts to slap together one or two complete systems. I love my games so every 2-3 years I usually end up replacing my video card. So, I have a nice 8800GTX sitting in my spare parts pile just in case.

    I actually had to use my spare PSU recently, as my primary died in a glorious, loud, smoky bang. On the RMA form, for reason for return... unit exploded.

    All good points about SLI/Xfire. My next video card upgrade I might consider an x2 card. We will see, pretty happy with my GTX 285 at the moment.. :)
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    Either I'm just overlooking it, or a post I made in here was conveniently deleted. It was in regard to an issue with the game heating up graphics cards. I have a BFG GTX 285 OCX, inside of a Thermaltake Armor full-tower case, with 2x120mm front fans, 2x120mm rear fans, and a 25 cm side panel fan. I run Crysis under full load and I hit 70C and sometimes up to 74C.
    I go to the character selection screen in this game, and my GPU is already at 70C. Something isn't right about that. And no, it's not due to insufficient cooling, nor is it due to excessive dust, etc. Someone posted something about frying an 8800GT in the game; that's pretty rough. I have an 8800GT in my backup computer, and those bad boys are rock solid.
    I would have to say that there is something in the game that needs to be fixed. It's an awesome game, and by far the best-looking Star Trek game I've ever seen.
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    Either I'm just overlooking it, or a post I made in here was conveniently deleted. It was in regard to an issue with the game heating up graphics cards. I have a BFG GTX 285 OCX, inside of a Thermaltake Armor full-tower case, with 2x120mm front fans, 2x120mm rear fans, and a 25 cm side panel fan. I run Crysis under full load and I hit 70C and sometimes up to 74C.
    I go to the character selection screen in this game, and my GPU is already at 70C. Something isn't right about that. And no, it's not due to insufficient cooling, nor is it due to excessive dust, etc. Someone posted something about frying an 8800GT in the game; that's pretty rough. I have an 8800GT in my backup computer, and those bad boys are rock solid.
    I would have to say that there is something in the game that needs to be fixed. It's an awesome game, and by far the best-looking Star Trek game I've ever seen.

    As said many, many, many, many, many, many times by a number of people deserving an equal number of "many"'s, STO can do no more than max out your GPU, which while surprising for a game, should still be doing nothing that your system can't handle.

    Crysis is not some end all, be all graphical application. It doesn't come anywhere close to maxing out a GPU, so whether or not your system can "run Crysis on high" is really pretty irrelevant. Just because Crysis has beefy system requirements doesn't mean that its engine is especially prone to generating heat. When a system is struggling to run Crysis, it simply runs slowly, not hot.

    I think your particular GPU can probably run for sustained periods in the mid 80s at least, so until you start hitting 90 or so, don't worry about it.
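
    If you want to see for yourself what the character screen is doing to the card, one low-effort way is to log the GPU temperature while the game runs. The sketch below assumes an NVIDIA card with the nvidia-smi utility available on the PATH (ATI owners would need a different tool) and simply polls the reported temperature every few seconds.

    import subprocess, time

    def gpu_temp_c():
        """Ask nvidia-smi for the current GPU core temperature (NVIDIA cards only)."""
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        )
        return int(out.stdout.strip().splitlines()[0])

    # Log the temperature every 5 seconds while sitting at the character screen.
    if __name__ == "__main__":
        while True:
            print(time.strftime("%H:%M:%S"), gpu_temp_c(), "C")
            time.sleep(5)
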
  • Archived Post Member Posts: 2,264,498 Arc User
    edited January 2010
    Stormnut wrote: »
    and now for the budget solution to this:

    My gaming rig is having this issue; my fix was to open the case up, take a small 10-dollar Walmart floor fan, and have it blow outside air over the motherboard and CPU. This cooled my temps down by 20C when running the game at max settings, and I have not had this problem since. It is a stop-gap solution until I can get a new heatsink and fan for my CPU.


    Epic WIN. Ghetto fabulous, aka Redneck Engineering, at its finest!
This discussion has been closed.