
Thread: Anyone improved the game's performance with just a CPU upgrade?

  1. #16
    Rich Aemry
    Guest


    Quote Originally Posted by Kuldebar View Post
It's not a wrong decision either way; it depends on the individual's situation more than anything else.

Yeah, the 640 was a nicely priced new quad-core, but the lack of an L3 cache was the one glaring deficiency to many users and reviewers. So, for about 60 bucks more than what I spent on the 640, I can get a better-performing chip.
Rift runs about the same on 3 or 4 cores with the way it handles threading. It was a design choice for future hardware compatibility, one that kicks old hardware in the arse but was smart for a game with the shelf life of an MMO.

Crank the FSB on that 720 to at least 240. Multis don't mean crap in this game. Once you get it to the 3.2-3.5 GHz range you are going to see almost identical performance to what you will see anywhere else in AMD CPU land. If you don't have 3 spare bills, buy a cooler for now; otherwise you will be disappointed.
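For reference, the resulting core clock is just the reference clock ("FSB") times the CPU multiplier. A quick sanity check, as a sketch assuming the X3 720's stock 14x multiplier:

[CODE]
# Core clock = reference clock ("FSB") x CPU multiplier.
# Assumes the Phenom II X3 720's stock 14x multiplier.
stock_ref_mhz = 200   # AM3 stock reference clock
oc_ref_mhz = 240      # the suggested overclock
multiplier = 14

print(stock_ref_mhz * multiplier / 1000)  # 2.8  -> stock 2.8 GHz
print(oc_ref_mhz * multiplier / 1000)     # 3.36 -> inside the 3.2-3.5 GHz range
[/CODE]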

I was the guy who started talking about the FSB being the significant factor in this game -- look in my comments. I have my AMD CPU stuff down. I have had at least one gaming PC with an AMD chip all the way back to the K6-2. I'm just trying to save you frustration.

  2. #17
Shadowlander tersagun
    Join Date
    Mar 2011
    Posts
    38


Thanks for the supportive replies, guys.

Regarding the FSB tweak, I'm already at a 240 MHz FSB with 2160 MHz HT and NB buses (I have to keep these settings to be able to run my memory at 1600 MHz).


And regarding waiting for Bulldozer and being a fanboy:
I'm more of an Intel hate-boy than an AMD fan-boy. They've earned fortunes by overpricing everything they sold because they were a monopoly. It's everyone's own choice to ignore anything happening outside their own world, but I care about that stuff.

If that results in worse PC performance, so be it. I'll play slower with dignity ;-)

  3. #18
    Soulwalker
    Join Date
    Feb 2011
    Posts
    23


I went from an Intel P4 HT at 3.20 GHz to an AMD Phenom II X4 980 at 2.8 GHz. Yeah, it helped... a lot.


As for the Intel vs. AMD debate, the reason I went with AMD is that I've had very good experiences with the brand. Ironically, I hate ATI cards but love Nvidia cards. I just go with whatever is more of a personally tried and trusted brand. /shrug

  4. #19
    Sword of Telara
    Join Date
    Feb 2011
    Posts
    878


    Quote Originally Posted by Rich Aemry View Post
There is no AMD CPU that can handle this game right now, and there won't be until Bulldozer in June.
That is really highly subject to your definition of "handle." My CPU may be the ultimate limiting factor on game performance, but it's solid enough to manage 40-50 FPS in 80-90% of all situations in the game. I may drop to the 15-20 range during invasion raids (I'm usually too busy paying attention to the mobs to look), but beyond those instances, which comprise approximately 10% or less of the game content I experience, and an even smaller percentage of game time played, it handles the game quite well.

AMD's CPUs may be shafted by the game engine, especially with what appear to be artificial locks on how, and what percentage of, CPU resources are used, but that doesn't mean that they can't handle the game.

At least as long as we're talking about almost any CPU that isn't a Sempron and is newer than the Athlon XP line.

    Quote Originally Posted by tersagun View Post
Thanks for the supportive replies, guys.

Regarding the FSB tweak, I'm already at a 240 MHz FSB with 2160 MHz HT and NB buses (I have to keep these settings to be able to run my memory at 1600 MHz).
Speaking from my own experience with an ASRock motherboard, try to keep the NB/HT to 2200 MHz at the most. At stock speeds, if I set mine to 2400, Windows won't load properly no matter how much voltage I pump in, and past that I have to set voltages high enough that I don't feel comfortable with them just to get Windows to load. Combine that with the apparent lack of effect on performance from modifying either number, and I really see no reason to let them go over 2.2-2.25 GHz. It's pretty much the same reason I'll never advise changing the PCIe speed: as near as I can tell it's utterly useless to do so, as there's no performance difference between 85 MHz and 115 MHz.

    I'll grant that those issues, and my issues reaching the overclock speeds posted by many other sources over the years, are likely directly influenced by my motherboard, but given the apparent lack of performance enhancement/detraction either way it doesn't make much sense to me to push the speeds as fast as your board will allow.
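To put numbers on why 2200 MHz ends up being the practical ceiling here, a sketch assuming the usual integer NB/HT multiplier steps on AM3 boards:

[CODE]
# NB/HT link speed = reference clock x integer multiplier (typical AM3 boards).
ref_mhz = 240
for multi in (8, 9, 10):
    print(multi, "x ->", ref_mhz * multi, "MHz")
# 8 x -> 1920 MHz
# 9 x -> 2160 MHz   (highest step under the ~2200 MHz comfort ceiling)
# 10 x -> 2400 MHz  (where Windows stops loading on my board)
[/CODE]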

    Quote Originally Posted by tersagun View Post
And regarding waiting for Bulldozer and being a fanboy:
I'm more of an Intel hate-boy than an AMD fan-boy. They've earned fortunes by overpricing everything they sold because they were a monopoly. It's everyone's own choice to ignore anything happening outside their own world, but I care about that stuff.

If that results in worse PC performance, so be it. I'll play slower with dignity ;-)
Ever play Dragon Warrior 3? If so, do you remember the cursed weapon, the multi-edged blade? You had a chance of hurting yourself every time you used it. Using AMD over Intel because of Intel's business practices is much the same (and let's face it, they haven't been a monopoly in about 11-12 years, even after Cyrix kicked the bucket and left only AMD): it's roughly a 50% chance that AMD will take the "correct" course on customer support.

Almost every hardware developer out there understands that to remain competitive they need to have the top product or, preferably, several top products. The issue is that by consistently buying AMD chips for high-end, single-application gaming PCs when they aren't the best-suited CPUs on the market for that, your money is used to fund the budgets for low-end CPUs, netbook APUs, and other products you likely will never buy. That's the nature of the beast, and there's nothing we can do to change it.

The problem comes in when a business like AMD decides that, since they're struggling to keep up on the gaming CPU side of things, they're just going to devote less funding to it and ratchet up the pressure in the other areas. Unless they publish a public statement about this, you could end up spending a generation or two buying AMD chips for gaming and supporting them before you realize you're wasting your money, because they aren't using it to support your needs and wants.

AMD would, of course, be stupid to do that, especially given that on a comparable-generation basis they have routinely managed to bring CPUs to market that match Intel's for output while exceeding or undercutting them in power usage, heat production, and/or cost in many categories, including low- to mid-range gaming. But it's possible. That's why blindly buying AMD hardware just to snub Intel is just as stupid as buying Intel because it's more expensive. Sometimes the best way to encourage a company to improve its products in one area is to buy a competitor's when it's superior.

All that said, I don't plan on buying a new CPU until the "Bulldozer" line hits the market, at the earliest. I may not buy AMD's next-generation CPU, but I'll wait until then so that I can see some real performance comparisons, as well as what its release does to the prices on Intel's Sandy Bridge line. It's really the smart thing to do so long as you can run the game at an acceptable frame rate, regardless of settings or CPU used.

If you can't run the game at a level that doesn't actively frustrate you every time you log in (such as with the current in-game audio issues that can't be fixed by hardware), then by all means upgrade. But if it won't give you an ulcer or even cause you to take an extra ten aspirin, waiting a few months before upgrading will do no one any harm, even if you use the extra money saved to upgrade to Intel.
    Last edited by Marikhen; 04-03-2011 at 11:48 AM.

  5. #20
    Rich Aemry
    Guest


    Quote Originally Posted by Marikhen View Post
That is really highly subject to your definition of "handle." My CPU may be the ultimate limiting factor on game performance, but it's solid enough to manage 40-50 FPS in 80-90% of all situations in the game. I may drop to the 15-20 range during invasion raids (I'm usually too busy paying attention to the mobs to look), but beyond those instances, which comprise approximately 10% or less of the game content I experience, and an even smaller percentage of game time played, it handles the game quite well.

AMD's CPUs may be shafted by the game engine, especially with what appear to be artificial locks on how, and what percentage of, CPU resources are used, but that doesn't mean that they can't handle the game.
    2 things:

1. 30 FPS is usually considered the minimum playable framerate. I was going off of that, and in a world event or raid in Rift, an AMD CPU drops below that on Ultra.

2. The limits aren't artificial. Set your system to 3-4 cores in the core affinity (which, BTW, will improve framerate) and you will see that at 3 cores you get 90% or so usage, but at 6 you get 30% or so. In Performance Monitor you can see that Rift generates a couple dozen threads, which means the threads are smaller and more numerous. That would lead one to think they're designed for a CPU with parallel threading tech like Hyper-Threading, or whatever AMD will call what they do with Bulldozer, which is along the same lines. This was a smart decision by Trion: supporting parallel threading, which will be the norm in 3-5 years when the game will hopefully still be on the market, over 6 larger threads, which will never be the norm.
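If you want to repeat the affinity experiment without clicking through Task Manager every session, here's a minimal sketch using Python's psutil package (the process name "rift.exe" is a guess; check yours in Task Manager):

[CODE]
# Minimal sketch: pin Rift to three cores and report its thread count.
# Requires the psutil package; "rift.exe" is a guessed process name.
import psutil

for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").lower() == "rift.exe":
        proc.cpu_affinity([0, 1, 2])              # restrict to cores 0-2
        print("affinity:", proc.cpu_affinity())   # -> [0, 1, 2]
        print("threads: ", proc.num_threads())    # "a couple dozen" per the above
        break
else:
    print("Rift process not found")
[/CODE]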

This is crappy for the AMD hex-core users but makes sense from a business point of view. I actually would have done the same thing, as much as I was annoyed by it.

  6. #21
    Soulwalker
    Join Date
    Feb 2011
    Posts
    12


I'm currently running an AMD Phenom II X4 965 ($160 on Newegg), and it isn't having any issues at all with RIFT. Even with an XP VM running in the background, it only goes to around 60% usage per core in highly populated areas.

The real bottleneck in my system is the GPU; I'm still running an 8800GT. With fairly low graphical settings, I get about 15-20 FPS in Meridian and about 40-45 when outside of towns (1440x900 resolution).

Turning my settings up to Ultra, my CPU still didn't even go above 70% usage (with the VM). AMD CPUs can handle RIFT.

  7. #22
    Rich Aemry
    Guest


    Quote Originally Posted by Kaydance View Post
I'm currently running an AMD Phenom II X4 965 ($160 on Newegg), and it isn't having any issues at all with RIFT. Even with an XP VM running in the background, it only goes to around 60% usage per core in highly populated areas.

The real bottleneck in my system is the GPU; I'm still running an 8800GT. With fairly low graphical settings, I get about 15-20 FPS in Meridian and about 40-45 when outside of towns (1440x900 resolution).

Turning my settings up to Ultra, my CPU still didn't even go above 70% usage (with the VM). AMD CPUs can handle RIFT.

That 15-20 in Meridian is CPU related, and like I said, you don't get efficient usage of cores until you switch to a 3-core affinity.

That 8800GT is weak too, but it's only part of the problem in this game.

  8. #23
    Soulwalker
    Join Date
    Feb 2011
    Posts
    12


    Quote Originally Posted by Rich Aemry View Post
That 15-20 in Meridian is CPU related, and like I said, you don't get efficient usage of cores until you switch to a 3-core affinity.
    I can assure you that it isn't. Closing my VM resulted in 0 FPS gain with less overall CPU usage.

    Also, RIFT has access to all 4 cores, not an issue.

Moving RIFT to the SSD might have an impact in Meridian (loading lots of different textures for player armor/race/skin/hair/whatever). I'll have to try that as well.
    Last edited by Kaydance; 04-03-2011 at 02:50 PM.

  9. #24
    Rich Aemry
    Guest


    Quote Originally Posted by Kaydance View Post
    I can assure you that it isn't. Closing my VM resulted in 0 FPS gain with less overall CPU usage.

    Also, RIFT has access to all 4 cores, not an issue.
It has access to all of them, but it doesn't use more than 3 standard cores efficiently. If you switch to using 3 cores you will see what I'm talking about.

The issue has never been whether Phenom IIs SHOULD be the bottleneck, because they really shouldn't. The issue is that they ARE the bottleneck, and what to do about it.

Your VM in the background on an underutilized CPU makes zero difference, because an AMD CPU is, like everyone says, underutilized. Games should in theory use every ounce of processing power a PC has, and everything should be running at 100% usage. Right now, AMD users as well as pre-Core iX Intel users aren't getting 100% CPU usage or 100% GPU usage.

I am now getting 100% CPU usage and 100% GPU usage in Rift on my new i7. That, combined with the fact that Rift spawns 35 threads or so, would tell you that it's a threading choice that causes the problems for AMD and other legacy CPUs.

Wide-open terrain is harder on a GPU than Meridian would be. Meridian is a CPU killer because of all the calculations required to place all the players at once.

  10. #25
    Sword of Telara
    Join Date
    Feb 2011
    Posts
    878


    Quote Originally Posted by Rich Aemry View Post
    2 things:

1. 30 FPS is usually considered the minimum playable framerate. I was going off of that, and in a world event or raid in Rift, an AMD CPU drops below that on Ultra.
Hence my stating that it really depends on what you mean by handling it. I personally consider the minimum playable frame rate to be around 15 FPS for most games. Some, like EVE Online, are playable at closer to 10 FPS, but that's stretching it, and that's also a game where frequent visual updates aren't necessary, as evidenced by many players over the years asking the developers for an "overview-only" client that completely eschews 3D graphics.

    Quote Originally Posted by Rich Aemry View Post
2. The limits aren't artificial. Set your system to 3-4 cores in the core affinity (which, BTW, will improve framerate) and you will see that at 3 cores you get 90% or so usage, but at 6 you get 30% or so.
Actually, no, it won't improve my frame rate. I get the exact same frame rate with 3 cores through 6. I've checked. Dropping below 3 reduces the frame rate: a small amount at 2, and a huge, and rather expected, drop at 1.

Likewise, per-core usage across three cores only amounts to 70%, not 90%. Even when the primary core hits 90%, it's not for long, and then it's the only one. Hell, I've had at least one instance where I saw the primary core's usage drop to about 40% while one of the other two cores in use by the game jumped to roughly 70%.

    Quote Originally Posted by Rich Aemry View Post
In Performance Monitor you can see that Rift generates a couple dozen threads, which means the threads are smaller and more numerous. That would lead one to think they're designed for a CPU with parallel threading tech like Hyper-Threading, or whatever AMD will call what they do with Bulldozer, which is along the same lines. This was a smart decision by Trion: supporting parallel threading, which will be the norm in 3-5 years when the game will hopefully still be on the market, over 6 larger threads, which will never be the norm.
Last I checked, Rift used 27-29 threads. Compare that to 56 threads with World of Warcraft, Metro 2033's 31 threads (often hitting 40% CPU usage during combat in the prologue), and 29 threads in Crysis (x64) with 23-25% CPU usage (35% in CPU test map 2); even the lowly Doom running on the latest gzDoom executable hits 18 threads and 18% usage. Even Mass Effect pulled 31 threads, though at a lowly 23-25% CPU utilization. Starcraft 2, by contrast, ran 54-55 threads, though it still only pulled about 20-25% CPU usage, even with 150 battlecruisers on-screen and fighting. In some games, such as ones with absurdly high supply rates allowing you to field 200+ brood lords, CPU usage seemed to drop to 10% or less.

Hellgate: London (DX10, x64 single-player client) was a bit of an odd duck, however, starting at 45 threads and dropping to 39 or so before I quit playing. CPU usage remained steady at around 20% while in combat. Borderlands hit nearly 40 threads.

Now, I'll grant that all of those are FPSes or RPGs, none of them running in multiplayer, excepting World of Warcraft. Here are some numbers from the F2P MMOs I (very) randomly play:
Shin Megami Tensei Imagine: 22 threads on average; fluctuated a lot.
Neo Steam: Wow, this one takes the cake. 70 threads.
Requiem Memento Mori: 23 or so.
Fiesta: 38.
Perfect World International: A pathetic 20, even with all the graphics rendering apparently done on the CPU.
Allods Online: 28 or so.

In almost every case the primary limiting factor is my GPU. The only exception is Starcraft 2, which, for some reason, seems to flatline CPU and GPU usage when you have something like 100 carriers, 200 brood lords, or similarly high numbers of resource-intensive units in play.
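For anyone who wants to reproduce these numbers, here's a minimal sketch of the measurement itself, again assuming the psutil package ("game.exe" is a placeholder for whichever client you're checking):

[CODE]
# Sample a running game's thread count and CPU usage a few times.
# Requires psutil; "game.exe" is a placeholder process name.
import psutil

for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").lower() == "game.exe":
        for _ in range(5):
            # cpu_percent(interval=1.0) blocks for one second per sample;
            # the value can exceed 100% when several cores are busy.
            print("threads:", proc.num_threads(),
                  "cpu:", proc.cpu_percent(interval=1.0), "%")
        break
[/CODE]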

As for the idea of being designed for parallel/hyper-threading, that would be a nifty idea if someone hadn't posted a spiffy picture a while back of this game using all of their physical cores while none of their virtual/fake/HT cores were apparently in use.

Regardless of the parallel/HT issue, I'll stand by my statement that the game artificially limits its usage of available resources. If it needs more output from the hardware it should request it, and it's not. It's requesting the same amount now as it did at stock speeds, and the only reason I'm seeing any performance improvement is that the raw power behind that amount has gone up. Other games vary in how much CPU usage they have, lower when they don't need it and higher when they do, but Rift tends to flatline at 30% unless I enable v-sync.

    Quote Originally Posted by Rich Aemry View Post
This is crappy for the AMD hex-core users but makes sense from a business point of view. I actually would have done the same thing, as much as I was annoyed by it.
Be that as it may, it wouldn't be such an issue if it didn't force those people to OC their systems in order to get quality performance out of a game that doesn't actually stress the hardware. The game would have received approximately the same amount of processor output at 37% usage and stock speeds as it would at OC'd speeds and 30%. Likewise, if that 30% was enough at stock speeds, then after OC'ing, CPU usage should have dropped to 25%.
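The arithmetic behind that, as a sketch (the stock and OC'd clocks here are placeholders, not my actual speeds):

[CODE]
# Effective CPU output ~= clock speed x utilization. If the game only asks
# for a fixed amount of work, utilization should fall as the clock rises.
stock_ghz, oc_ghz = 2.8, 3.5   # placeholder clocks
for stock_util in (0.37, 0.30):
    oc_util = stock_util * stock_ghz / oc_ghz
    print(f"{stock_util:.0%} at stock ~= {oc_util:.0%} after the OC")
# 37% at stock ~= 30% after the OC
# 30% at stock ~= 24% after the OC (roughly the 25% above)
[/CODE]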

/shrugs. Either way, whether due to design or defect, the game is what's degrading performance on my system in this area, and it results in an artificial limit on performance. Even ignoring the flat rate on CPU usage, this holds true if what you're saying about how it manages its threads is accurate.

  11. #26
    Ascendant
    Join Date
    Jan 2011
    Posts
    1,484


    Quote Originally Posted by Rich Aemry View Post
    I am now getting 100% CPU usage an 100% GPU usage in Rift on my new i7.
    It's impossible to have the perfect balance. One will ALWAYS be utilized a little more or a little less than the other.
    2600K@5.3Ghz
    XFX GTX580@960
    28" 1920x1200
    SuperNerd

  12. #27
    Rich Aemry
    Guest


    Quote Originally Posted by Marikhen View Post
*snip: full quote of post #25 above*
I think we're just going to wind up arguing until we're blue in the face, so to speak, while agreeing on all but causation. All 8 logic units are getting used at near 100%, and I had the same problems you're having now, with the same CPU at the same speeds you're at now. I had better results with 3 cores running, though I set all cores to equal.

My main point was that a Phenom II X4 or X6 is the wrong upgrade for this game, and I think you would agree with me regardless of reasoning. The OP will get identical performance by just OCing the FSB on his X3. If he wants to upgrade his CPU for this game, he needs to either wait two months or switch to an Intel chip.

I don't want this to get bitter; I like you (as much as you can like someone on a forum), and I think we both agree on the basic point here. If you got identical FPS with your CPU in x3 mode, then why upgrade to an x6?

  13. #28
    Rich Aemry
    Guest


    Quote Originally Posted by C-BuZz View Post
    It's impossible to have the perfect balance. One will ALWAYS be utilized a little more or a little less than the other.
    Give or take, of course.

  14. #29
    Ascendant
    Join Date
    Jan 2011
    Posts
    1,484


    Quote Originally Posted by Rich Aemry View Post
    Give or take, of course.
How are you getting 100% usage anyway? I don't ever see more than 40-50% usage (spread evenly across 4 cores) on my 2600K, and usually around 70-90% on both GPUs.
    Last edited by C-BuZz; 04-03-2011 at 05:26 PM.
    2600K@5.3Ghz
    XFX GTX580@960
    28" 1920x1200
    SuperNerd

  15. #30
    Rich Aemry
    Guest


    Quote Originally Posted by C-BuZz View Post
How are you getting 100% usage anyway? I don't ever see more than 40-50% usage (spread evenly across 4 cores) on my 2600K, and usually around 70-90% on both GPUs.
Fresh install with a decent RAID setup and an OCed base frequency (111 MHz) with a 52 multi. I set the game to primary core 0 to get even distribution, turbo boost is off, and I cranked everything. Before I crank AA and AF I don't get full usage of anything (which is odd even to me).

    Using Radeon Pro set to supertile as well.
