Best CPU For Gaming – 9 Processors Tested

We decided to conduct thorough testing and try to establish an answer to this never-ending dilemma: what is the best CPU for gaming?

“Which CPU should I buy? Intel or AMD? What about overclocking: is there any use for it at all, and if there is, is it worth paying extra for a more expensive overclockable CPU?”




So many questions that need answering, and answering even one of them is a difficult task. PC gaming is at a turning point: next-gen consoles have been released, while previous-generation consoles are still here and will probably stick around for a year or two at least.

How does this affect PC gaming, you ask? Game developers usually do not bother optimizing games when porting them from consoles, which usually means that games only use 2 CPU cores/threads, or in the best-case scenario 4 cores/threads. And yes, next-gen consoles have an eight-core CPU, but if you have read our previous benchmarks you know that optimization isn't perfect with next-gen consoles and games either. As long as old consoles are still here and make up the majority of the market, improvement in optimizing console ports is not in sight. Most users are confused when looking to get a new CPU, since it's hard to pick a long-lasting solution without the fear of wasting money, which is quite valuable in these troubling times. Not an easy time to pick a CPU; then again, it never was.
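To make the core-count point concrete, here is a minimal Python sketch (the function and workload are hypothetical, not from any real game engine): a port-style worker pool hardcoded to 2 threads versus one sized to whatever CPU the machine actually has.

```python
import os
from concurrent.futures import ThreadPoolExecutor

def simulate_frame_tasks(n_tasks, n_workers):
    """Run n_tasks stand-in jobs on a pool of n_workers threads."""
    def task(_):
        return sum(range(10_000))  # placeholder for per-frame work
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return list(pool.map(task, range(n_tasks)))

# A lazy console port hardcodes a tiny pool...
port_style = simulate_frame_tasks(8, 2)
# ...while a well-threaded engine sizes its pool to the machine.
scaled = simulate_frame_tasks(8, os.cpu_count() or 4)
```

The point is only the pool sizing: an engine that hardcodes two workers leaves the remaining cores of a quad- or eight-core CPU idle, which is exactly why poorly threaded ports leave performance on the table.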


  • Justin

    Interesting article… good for those needing a budget build to get the most out of games.
    However, IMO the i7 isn't a waste of money for gaming. Running at 1440p the i7 is nice, especially in large multiplayer games. I have had and used just about every chip since 2009 for gaming, and the 4770K is hands down a monster 4-core CPU. Yes, the 4930K is a bit better, but that's a totally different budget. Also, the 4770K can be found for around $250 or even less. Anyways, thanks!

    • Milos M

      As you go up in resolution, CPU dependency in terms of performance drops, and the difference wouldn't be that noticeable. That's why, for example, when people want to test the true CPU power they test at 720p or even lower, on the lowest possible settings. Games nowadays are mostly optimized to use a quad core efficiently (such as BF4), while games such as Metro: Last Light drain every CPU to its limits most of the time.

  • protoo

    BF4: an FX-6300 at stock performs just like an FX-8320 at stock o.0, that's not right.

    • Milos M

      The data are valid; the developers at DICE should explain this laziness. It's a similar story with i7 vs i5…


    A very well researched and comprehensive article, although I hope the author would consider the fact that games are still not optimized for 8 cores, so the true potential of the FX-8350 can't be judged by most of today's games. Also, why was the OC capped at 4.5 GHz? The FX-8350 can do much better at higher clock speeds with proper cooling. Still a very informative read.

    • Milos M

      It has been considered and acknowledged, but that is the performance of the current CPUs. When Mantle comes along the situation will probably get better for FX CPUs, but not in all games, and it will take some time before Mantle is accepted as a standard, or at least as an alternative worthwhile for developers. When we decided on the OC frequency we went for the most viable option, meaning that almost anyone can reach 4.5 GHz on all of the CPUs. Our test sample can go up to 4.7 GHz, but it requires much higher voltage, so it's not really worthwhile; the same goes for the i7-4770K, whose limit is 4.6 GHz (our sample). In a real-life situation you can hardly squeeze more out of FX and Haswell CPUs…

    • cubs223425

      Yes, but by the time that 8 cores start getting used, Piledriver will be obsolete. It could even be argued that it already is, given that AMD is launching Steamroller in a few days (but not an FX CPU).

  • Matt

    AMD performance is skewed… an 8-core should not perform worse than a 6-core.

    • Nikolas Nikolaou

      It solely depends on game optimization. Most games can’t utilize all cores.

  • cubs223425

    I’m about to start reading the article, but the general test bothers me enough to point out what I think is a meaningful flaw in the system.

    The premise of testing a bunch of these CPUs is great, and I like it. However, I think that most folks aren’t interested in seeing the 4570. I think that more would prefer to see the Ivy Bridge i5-3570K tested against the i5-4670K, to see what kind of difference the generations offer at stock. The same could be said for the i3 choices, since the difference between those two is just a slight clock increase (200 MHz).

    I'd also be intrigued if you had taken out the FX-8320 and instead offered up benchmarks of the high-end Richland CPU or APU, since the 8320 should perform close to the 8350, as their difference is again just clock speed (though 500 MHz this time).

    I think that an i3, i5, and i7 from both Ivy and Haswell should represent Intel, while AMD should offer up a high-end APU, a high-end Richland-based CPU, the FX-6300, and the FX-8350. Maybe, instead of the two Richland chips, you could have gone with one of them and one of the FX-9000 chips. I just don’t see $13 being a meaningful-enough barrier for gaming purposes, and so the FX-4300 is pretty useless here.

    Then again, that is all just personal opinion based on generational intrigue.

    • Nikolas Nikolaou

      Thanks for pointing out a few of your personal views, cubs223425. We are slowly trying to add more systems into the mix so people can see everything side by side and have a better picture of what to get.

      In my personal opinion, testing Intel's Ivy vs Haswell, or in general any very closely related generations of CPUs, is pointless. I have swapped from Sandy to Ivy to Haswell and haven't seen anything great enough in terms of fps jumps to even document.

      Most consumers stick to the same system for about 2-5 years. We added the 4570 non-K as some people prefer to build a budget system with a cheaper mobo and add a better GPU.

      The new APUs need to be tested, as that's a whole different story; good pointer on that.

      We try to cover as many price ranges as possible, as most consumers tend not to spend a lot on hardware.

  • Nikolas Nikolaou

    Hey Sam, I understand your concerns. We have nothing to do with DICE or EA, and we do not get “paid” for reviews like most sites. Our hardware and games come out of our own pockets, so I guess that's as unbiased as you can get (especially when you shell out yourself). With the new Mantle API, 6-core and 8-core optimization will become a reality. When? We will have to wait and see. I do not understand your rage; can you please explain a little better?


      There seems to be rampant paranoia and conspiracy theorizing among gamers recently, as I keep reading similar posts on lots of different websites about people being biased.
      In any case, I have just found this website today and love it, so I have bookmarked it in my browser.
      This was a good test, but I would like you to do another test where you benchmark older i7s, to see if there is any benefit to upgrading if you have an older i7.
      I have an i7-2600 non-K, sadly, so it's running at 3.8 GHz, and since owning it I have upgraded my GPU three times. I bought a prebuilt, which was my first ever PC, with the 2600 and a GTX 590. I then upgraded to the 690, and now I own a 980, but the CPU seems to be good to go for some time yet. I would like to have seen benchmarks just to see the difference anyway.
      Is that something you could do, as I noticed that you have various brands of AMD CPUs? You could start with the first i7 right up to the newest.

      • Nikolas Nikolaou

        I don't know why we would be biased towards any manufacturer or publisher when we pay full price for all hardware and software, like every consumer out there. People should get similar results on the same systems and drivers with very little variance. It's all in the data. I have answered some concerns below.

        • JAGUARCD32X

          I agree. I really like this site and don't really see what people are basing these accusations on, but it seems to be a common theme running across many different websites to squeal “Teh bias” any time someone doesn't like the results.

  • Yashtir Gopee

    For the CPU Performance Value Index, is lower better or higher?

    • Nikolas Nikolaou

      Higher is better. You basically get better bang for your buck.
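A minimal sketch of how such a value index could be computed, assuming it is simply average performance divided by price. The exact formula and the numbers below are hypothetical, not the article's actual methodology:

```python
def value_index(avg_fps, price_usd):
    # Performance per dollar: a higher number means more fps for the money.
    return avg_fps / price_usd

budget_chip = value_index(60.0, 120.0)   # hypothetical budget part: 0.5 fps/$
flagship = value_index(75.0, 300.0)      # hypothetical flagship: 0.25 fps/$
```

Under this definition, a cheaper chip with moderate performance can easily out-score a faster but much pricier one, which matches the "bang for your buck" reading above.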

  • Daniel

    I have an Inspiron 570 with almost everything upgraded but the processor. I need to know which processor (for gaming) will work with my computer and motherboard.

    Here is a list of parts I have upgraded so far:

    1. Power Supply Upgraded
    2. GPU Upgraded (NVIDIA)
    3. Hard Drive Upgraded (Multiple)
    4. Memory Modules Upgraded (3 out of 4)
    5. Internet Card Upgraded (T-Link)
    6. Still deciding (CPU)

    I Need To Know!

    • Milos M

      You need to be much more specific.
      What is your budget?
      What PSU and GPU do you have?
      Monitor resolution?
      And how much memory (in GB) do you have?

  • Ahmed AL-Jaber

    Of course no game can utilize too many cores, because it would just slow the game down. It's a different story with consoles, since consoles are HIGHLY optimized for specific hardware.

    • Nikolas Nikolaou

      If the game is optimized to use more cores it doesn't slow the game down; it helps get better and smoother performance on more powerful hardware. You can't change resolutions and settings on consoles, as they are limited by the type of hardware that's available when creating a game.


    I just found this website today and I love it, so I am going to use this site a lot from now on. I liked this article, but what I would really like to see is benchmarks of older i7 CPUs, as I own an i7-2600 Sandy and I would like to see how much the new Devil's Canyon has improved. I am not looking to upgrade my i7 right now, of course, as there is no game that gives it any issue. In fact, the highest recorded CPU usage I have seen is 60%, and only for little spikes during Crysis 3. On average my CPU is only used around 40%, and this is when running games at at least 60 fps.
    I think it's great that CPUs have long lives, which means we only need a GPU upgrade to keep things going.

    • Nikolas Nikolaou

      As mentioned previously, I swapped from a 2600K to a 3770K then a 4770K, and I am currently on a 3930K due to running multiple monitors/GPUs and now 4K. At stock they all provided similar performance. There might be a slight difference vs K processors, but only in games that utilize hyperthreading; with a GTX 980 it shouldn't be noticeable at all. The higher you go in resolution and settings, the less CPU-dependent you will be, though going to two cards or more you might need to change your CPU. It's always recommended to run a balanced system (older hardware is not ideal running with newer or more powerful parts). I would stick with your current CPU, though, as it still has some life in it; you can always overclock the base clock if you have a decent motherboard, when and if you feel that it's not enough.

      • JAGUARCD32X

        Thanks for the reply. I was just wondering how long a life my CPU should have, as I have gone through three GPU upgrades in this time, yet I am not noticing GPU performance lower than website benchmarks using the same GPUs and settings.
        Obviously that's a good thing, as it's one less component (and motherboard) to upgrade, but I am just wondering how long it's going to last.