Call of Duty: Advanced Warfare is out, and our PC benchmark performance analysis is ready on Day Zero. This year’s release is the 11th installment in the Call of Duty series, developed by Sledgehammer Games and published by Activision.

COD: AW is the first game in the series built on a three-year development cycle, and it is a multiplatform release (PC, Xbox One and PS4), while the Xbox 360 and PS3 versions will follow later, developed by High Moon Studios.

With all these multiplatform releases, PC gamers get the short end of the stick, suffering the port problems that plague most “next-gen” titles. If you remember last year (hard to forget, IMO), when we analyzed the performance of COD: Ghosts, it had a plethora of optimization issues as well as subpar graphics, even though it was labeled “next-gen”.

Before we dive into the benchmark data, we want to point out a few unique features and a few strange things we noticed while playing both the single player and multiplayer.

Call of Duty Advanced Warfare Matchmaking Lobby

Single and Multiplayer Features

The headline addition is the exoskeleton suit (hence “Advanced” Warfare), which provides a jetpack-style boost (reminiscent of Titanfall) and a few Crysis-like Nanosuit abilities, alongside sci-fi drones and hovercraft.

Alongside the single player campaign there are three multiplayer modes: standard multiplayer with a handful of modes and maps, the Exo Survival co-op mode, and the Combat Readiness Program (a mix of bots and real players for learning the ropes before jumping into MP). There’s also a paintball option (for underage players and paintball fanatics).

COD Advanced Warfare Class Variants

COD Weapon Variants


Frame Rate Cap and FOV

The single player mode has no frame rate cap, while multiplayer is capped at 94 FPS. There’s no FOV option in SP, but MP has an FOV slider ranging from 65 up to a maximum of 90.

System Requirements

The minimum requirements call for 6GB of RAM, a DX11-compatible GPU and an Intel Core i3-530 CPU. If your system fails to meet any one of these, the game WILL NOT load.

Testing Components

COD Benchmark System

CPU: Intel Core i7-3930K @ 4.3GHz
Motherboard: Asus P9X79 Deluxe
RAM: Kingston HyperX 16GB 1600MHz
SSD: Crucial M550 256GB
GPUs: Galax GTX970 EXOC 4GB, MSI R9-290 Gaming 4G
Monitor: Asus PB287Q 28″ 4K 60Hz 1ms
PSU: Corsair AX1200
OS: Windows 8.1
Drivers: Nvidia 344.48 WHQL, AMD Catalyst 14.9.2 Beta

Benchmark Methodology

As you might expect, benchmarking anything other than the multiplayer wouldn’t be representative: with players running around the map, plus gunfire and explosions, it’s the best way to really stress our configurations.

For every benchmark I played a 12-player TDM on the Riot map, starting the recording a few seconds into the match and capturing exactly 60 seconds. I only logged runs in which I didn’t die, so that kill cams wouldn’t skew the data.

I want to point out that these results are an indicative reference point, as there are too many variables to fully control; that said, after replicating the benchmarks a few times the run-to-run differences were negligible.

We didn’t include any SLI or CrossFireX benchmarks, as there aren’t yet any compatible drivers or profiles.
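
For transparency, here’s a minimal sketch of how a 60-second frametime log can be reduced to the average and percentile figures used in the charts below. It assumes a Fraps-style frametimes CSV (a header row, then one cumulative millisecond timestamp per rendered frame); the file name is illustrative.

```python
import csv

def load_frametimes(path):
    """Parse a Fraps-style frametimes CSV: a header row, then one
    cumulative millisecond timestamp per rendered frame."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    stamps = [float(r[1]) for r in rows[1:]]  # column 1 = time in ms
    # Per-frame render times are the deltas between timestamps.
    return [b - a for a, b in zip(stamps, stamps[1:])]

def fps_stats(ft_ms):
    """Average FPS plus the 1% / 0.1% worst frame times, converted
    to FPS (1000 / ms) so they read like the numbers in the charts."""
    avg_fps = 1000.0 * len(ft_ms) / sum(ft_ms)
    worst_first = sorted(ft_ms, reverse=True)

    def low_fps(pct):
        # Nearest-rank percentile: the frame time that roughly pct%
        # of all rendered frames were slower than.
        idx = max(0, int(len(worst_first) * pct / 100.0) - 1)
        return 1000.0 / worst_first[idx]

    return avg_fps, low_fps(1.0), low_fps(0.1)

avg, one_pct, point1_pct = fps_stats(load_frametimes("run1 frametimes.csv"))
print(f"avg {avg:.1f} fps | 1% low {one_pct:.1f} fps | 0.1% low {point1_pct:.1f} fps")
```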


Benchmark Performance Settings

COD Advanced Warfare Advanced Video Menu

COD Advanced Warfare Advanced Video Menu 2

COD Advanced Warfare Advanced Video Menu 3


For the settings I maxed everything out except supersampling, and turned Cache Sun Shadow Maps and Cache Spot Shadow Maps off. Those two settings created a strange blur that degraded image quality on one of the graphics cards.

COD Advanced Warfare 1920×1080 Benchmark

Call Of Duty Advanced Warfare GTX970 vs R9-290 Benchmark 1920x1080 framerate


We see the GTX970 running slightly ahead of the R9-290 in minimum frame rate, but all in all they are near identical, staying above 81 FPS throughout our 1920×1080 runs.

Call Of Duty Advanced Warfare GTX970 vs R9-290 Benchmark 1920x1080 frametimes


Looking closer, even though the R9-290 gives up a little in minimum frame rate, it actually performs better in terms of frametimes, lending the game a smoother feel.

COD AW 1920x1080 Screenshot

Call of Duty AW HD Screenshot


COD AW 2560×1440 Benchmark

Call Of Duty Advanced Warfare GTX970 vs R9-290 Benchmark 2560x1440 framerate

Call Of Duty Advanced Warfare GTX970 vs R9-290 Benchmark 2560x1440 frametimes


The 2560×1440 results follow the same pattern: the R9-290 trails the GTX970 in frame rate but leads in frametimes, with both cards staying above 72 FPS.

COD Advanced Warfare 1440p Screenshot

Call of Duty Advanced Warfare 2560×1440 Screenshot


Call of Duty AW 4K Benchmark

Call Of Duty Advanced Warfare GTX970 vs R9-290 Benchmark 3840x2160 4K framerate


At 4K the R9-290 pulls ahead, with a 15-20% lead in both average and minimum frame rate. It is surprising to see both cards hold an average close to 60 FPS and minimums of just over 45 FPS with all the settings dialled up.

Call Of Duty Advanced Warfare GTX970 vs R9-290 Benchmark 3840x2160 4K frametimes


On both GPUs the frametimes stay within a playable range without any wild variance; I didn’t feel any stutter or jerkiness on either card.

Call of Duty Advanced Warfare 4K Screenshot

COD Advanced Warfare 4K Screenshot


CPU Usage

Lowering the resolution to 1024×768 to expose CPU usage, we see all 12 threads of the Intel i7-3930K being utilized. At 1080p, 1440p and 4K we see very similar results.
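
For readers who want to log similar per-thread numbers during their own runs, here’s a minimal sketch using Python’s third-party psutil package (an assumption for illustration, not necessarily the monitor behind the chart below); the output file name and one-second sample rate are likewise illustrative.

```python
import csv
import time

import psutil  # third-party: pip install psutil

def log_cpu_usage(path="cpu_usage.csv", duration_s=60):
    """Sample per-thread CPU utilization once per second for the
    length of a benchmark run and write the samples to a CSV."""
    cores = psutil.cpu_count(logical=True)  # 12 on the i7-3930K
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["second"] + [f"cpu{i}" for i in range(cores)])
        for second in range(duration_s):
            # cpu_percent(interval=1) blocks for one second and returns
            # the utilization over that window for each logical CPU.
            writer.writerow([second] + psutil.cpu_percent(interval=1, percpu=True))

log_cpu_usage()
```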


COD AW CPU USAGE


VRAM Usage

COD Advanced Warfare Video ram Usage


Even on the highest AA setting at 4K resolution, maximum VRAM usage tops out at 3575MB. For those running different GPUs with less or more VRAM the numbers may vary, as the game grabs as much memory as it can, as seen at the HD resolution.

System RAM Usage

Call of Duty AW System Ram Usage


With the 16GB of Kingston RAM in the system we see usage of 5-5.3GB depending on the resolution. The 6GB minimum system requirement is about right, but the game should not refuse to load with less. Hopefully a patch will lift that restriction, as happened with the previous game in the series.

Conclusion

The game seems miles better than COD: Ghosts, and the data reflects a well-optimized port. I do have to admit the netcode is pretty bad: I encountered numerous issues with bullets not registering (even though my ping was low). This is, of course, the kind of thing that occurs on launch day until the game is patched.

There were also various glitches while loading menus (due to the animations), a few disconnects for no apparent reason, and long lobby waits in matchmaking.

COD Advanced Warfare MP Screenshot


Weapons, characters and NPCs are polished and look great, but building and terrain textures seem somewhat dull.

In terms of performance the game is silky smooth. For those who enjoy a fast-paced action FPS, Sledgehammer did a pretty good job this time around. We will wait for a netcode patch, as well as SLI/CFX support, before recommending the game outright.


If you have any questions about the benchmark please ask in the comment section below and I will be glad to answer.

About The Author

"Overclock and benchmark freak"

  • woodward54

    That ram usage is insane! I don’t think I own a single game that actually makes use of more than 2GB

    • Nikolas Nikolaou

      If it’s VRAM you are talking about, it depends on your GPU. For system RAM, anything between 3-6GB is the norm.

      • woodward54

        3-6GB is common for the recommended system requirements. Close to 5GB (at 1080p) is not the norm for the system memory usage of a game. I just checked some games I play: GTA IV 1.2GB, Portal 2 800MB, Just Cause 2 700MB, Metro Last Light 400MB, Black Ops 2 800MB. Unfortunately I don’t have more recent games to test. I believe Battlefield 4 and Skyrim can use closer to 3GB, but I don’t have those games to confirm. The only game I see using close to that is Minecraft.

        I think this might be a misunderstanding: whoever did this benchmark is probably looking at the memory usage of the entire system, not just the game.

        • Nikolas Nikolaou

          Yes, it’s for the total system; we run as few applications as possible, of course. We actually have a few other games we tested system memory usage on: COD Ghosts used 4144MB @ 1080p and 4455MB @ 1440p; AC4 used 3165MB at both 1080p and 1440p; BF4 used 6700MB on a full 64-man server at both 1080p and 1440p. You can find the relevant benchmarks in our benchmark category.

        • PIRATE or NINJA

          You say 3-6GB is common, then you say 5GB is not… but 5GB is in the range you said first, so which is it?

          • woodward54

            The “recommended system specifications” memory != the memory usage of the game itself (not including all the crap running in the background and various Windows processes)

          • lel

            That means 3, 4, and 6.

            Source: It’s not hard to figure this out, xD

  • apoc1138

    Why do you have a min FPS of 88 and then a corresponding frametime variance down to 28ms (which works out to about 35fps)?

    One of your sets of charts makes no sense

    • Nikolas Nikolaou

      From what I can see you are referring to the GTX970 @ 1920×1080; that’s the 0.1 percentile, meaning that 35fps occurred less than 0.1% of the time across the 5437 frames rendered. Framerate is frames per second (an average), while frametime analysis works on a per-frame basis. It’s a big topic to explain in a comment, so I’ll post the most relevant source for a thorough review of the difference, from TechReport: http://techreport.com/review/21516/inside-the-second-a-new-look-at-game-benchmarking as well as a screenshot of the frametimes for reference.

      • apoc1138

        Minimum isn’t an average; it is supposed to be the minimum.

        • Nikolas Nikolaou

          The 35fps (0.1 percentile) is negligible, as it’s 1 of the ~90 frames in a second (5473 total frames rendered over the 60 seconds) in this specific benchmark run. We provide the frametimes to show how smooth the game feels on each GPU. And yes, the R9-290’s frametimes were better; it’s all in the data.

          • apoc1138

            No other site presents their data that way. It doesn’t matter which bits of data you think are relevant: minimum should mean minimum, not “lowest average”, which is what you seem to be doing. I assume you are doing the same with maximums.

          • Nikolas Nikolaou

            We use the Fraps output for min/max/avg, and feed the Fraps frametimes into FRAFS Bench Viewer for the frametime percentiles. That’s the methodology we use in our latest benchmarks.

            I’m not sure which sites you are referring to, but I think more variation between publishers is better for consumers: it gives a more holistic view of how games perform without actually owning the game or the components.

          • apoc1138

            Fraps records the fps once per second. Using the minimum fps from the minmaxavg file is misleading. Every other serious review site uses frametime analysis as the basis of their min, max and average.

            Try having a read of this:
            http://www.digital-daily.com/video/vga_testing_2007/print

            in particular:
            “use the FRAPS results file of the type ‘… frametimes.csv’”
            “upon transformation of source data of the file ‘… frametimes.csv’, calculate the sought-for average FPS values and, if needed, the minimum and maximum FPS.”

            There is nothing holistic about incorrectly producing results.

          • Nikolas Nikolaou

            The truth of the matter is how frame latency impacts the experience. I made a small addition to the methodology in previous benchmarks, but some people didn’t understand how to interpret it.

            Guru3D shows minimum fps (per-second fps)
            Computerbase uses max fps (per-second fps)
            Pcgameshardware uses avg and max (per-second fps)

            They all benchmarked the single player mode though, which isn’t as demanding and has an unlocked framerate.

            It’s a mixed bag between regular users and power users (like yourself). Most people use the in-game FPS display (per-second fps), so there are going to be conflicting opinions.

            Thanks for the great discussion btw.

          • apoc1138

            Guru3D only shows average FPS, no minimum.
            Computerbase only shows average FPS, no maximum (and it was multiplayer, not single player).

            Look at it this way: if you use a “one data point per second” approach, even if the game spent 75% of its time at 30fps, if it spikes to 90fps at the top of every second then you end up with an average near 90fps, when in reality you spent a lot more time at 30fps, an exaggeration to be sure.

            Averaging the frametime file would be more accurate.

            Or to put it another way: if the police ping me at 100mph, me saying “but I averaged 60mph, I only did 100 briefly” is not a defence.

          • Nikolas Nikolaou

            It’s not the top of every second, but frames/60.

            When taking minimum framerate from percentiles, how would you interpret one graphics card with:

            4999 frames rendered at or below 16.7ms (60fps converted)

            1 frame rendered at 33.3ms (30fps converted)

            Minimum FPS = 30fps

            while a second GPU has:

            4500 frames rendered at or below 16.7ms (60fps converted)

            300 frames rendered at or below 33.3ms (30fps converted)

            Minimum FPS = 30fps

            I think you get the idea of why taking a minimum from a “frametime-converted framerate” is, imo, somewhat flawed. To make the two readings concrete, there’s a rough sketch below this thread comparing them on the same log.

            There’s no one-size-fits-all analysis; if you don’t sit down and play the game yourself, it’s all subjective.
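
As a footnote to the thread above, here’s a rough sketch of the two readings being debated, applied to the same Fraps-style frametime log: the per-second minimum FPS that the minmaxavg file approximates, versus a percentile taken directly over per-frame times. The file name is illustrative, and `load_frametimes` is the helper sketched in the methodology section; note how a single slow frame barely moves the per-second count but dominates the 0.1% percentile.

```python
# Two ways of reading the same frametime log; load_frametimes is the
# helper from the methodology sketch earlier in the article.
def per_second_min_fps(ft_ms):
    """Fraps minmaxavg-style minimum: count the frames completed in
    each whole second of the run and take the smallest count."""
    counts, elapsed, count = [], 0.0, 0
    for ft in ft_ms:
        elapsed += ft
        count += 1
        if elapsed >= 1000.0:  # a full second has elapsed
            counts.append(count)
            elapsed -= 1000.0
            count = 0
    return min(counts) if counts else count

def percentile_min_fps(ft_ms, pct=0.1):
    """Percentile reading: the frame time that roughly pct% of all
    frames were slower than, converted to FPS."""
    worst_first = sorted(ft_ms, reverse=True)
    idx = max(0, int(len(worst_first) * pct / 100.0) - 1)
    return 1000.0 / worst_first[idx]

ft = load_frametimes("run1 frametimes.csv")
print(f"per-second min: {per_second_min_fps(ft)} fps | "
      f"0.1% frametime min: {percentile_min_fps(ft):.1f} fps")
```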