You can. I did it yesterday so I could talk with my fleet and check out stuff on Tribble. However, you're running two copies of the game at the same time, so unless you have a very powerful computer, expect issues. For me, my framerate dropped from 60 to about 20.
Just /renderscale 0.5 the tribble client if your computer is terrible.
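You only need to enter it on the client you want to lighten; it's typed into the chat window like any other slash command. A quick sketch (assuming 1.0 is the default full-resolution value, which I haven't double-checked):

/renderscale 0.5 (on the Tribble client: renders at half resolution, much lighter on the GPU)
/renderscale 1.0 (switch back to normal when you're done)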
Everywhere I look, people are screaming about how bad Cryptic is.
What's my position?
That people should know what they're screaming about!
(paraphrased from "The Newsroom")
Just for reference: an NVIDIA 560 Ti 2GB card can run both clients at the same time at full settings (Windowed Maximized mode) at a resolution of 1680 x 1050.
Both clients generally consume about 600-700 MB of video RAM, so a 2GB card can swing it if you have it. A good idea is to go into the advanced options and enable "Limit CPU usage when inactive." Then when you switch from one client to the other, they won't compete as much for resources.
I want my changeling lava lamp!
I did this without any problems during the LoR test period. I can launch at most three instances before my framerate really drops.
I'm running a Samsung RC730 (6GB DDR3).
That's quite the paradox: how could you nerf a nerf when the nerf is nerfed? But how would the nerf be nerfed when the nerf is nerfed? This allows the nerf not to be nerfed, since the nerf is nerfed? But if the nerf isn't nerfed, it could still nerf nerfs. But as soon as the nerf is nerfed, the nerf power is lost. So, paradoxically, the nerf nerf lost its nerf while it's still nerfed, which cannot be, because the nerf was unable to nerf.
I call it, the Stoutes paradox.