Shadow Of Mordor PC Benchmark Performance
Nikolas Nikolaou | October 8, 2014 | Benchmarks

Middle-earth: Shadow of Mordor is a AAA title developed by Monolith Productions and published by WB Interactive, a hybrid of Assassin's Creed stealth and Batman: Arkham combat, as many have described it. "Next-gen" open-world games usually tax our PC systems heavily, especially the graphics card, so the hype built quickly when leaked images of the texture detail settings page began circulating, stating that maximum settings would need 6GB of VRAM or more for optimal gameplay at 1080p. We instantly wondered whether we had the next Crysis on our hands, or just a very poorly optimised console port.

Apart from a select few (very pricey or custom) models, there aren't many GPUs even in AMD's and Nvidia's higher-end tiers that offer 6GB of video RAM. That requirement seems far-fetched for 1080p by today's standards, especially with 1440p and 4K monitors becoming a stepping stone to affordable higher-resolution gaming. If we need that much memory for 1080p, what would we need for anything higher, or for additional anti-aliasing? You guessed it: benchmark time.

We decided to analyse performance and gameplay through frame rates, frametimes and VRAM usage, along with an image comparison of the different presets. We want to see what is really going on with this modified console port and what we can get out of our single and SLI/CFX cards.

Testing Components

Shadow Of Mordor Benchmark System
CPU: Intel Core i7-3930K
Motherboard: Asus P9X79 Deluxe
RAM: Kingston HyperX 16GB 1600MHz
HDD/SSD: Crucial M550 256GB
GPUs: 2x Galax GTX 970 EXOC 4GB, 2x MSI R9 290 Gaming 4G, 2x Asus GTX 680 2GB (reference)
Monitor: Asus PB287Q 28" 4K 60Hz 1ms
PSU: Corsair AX1200
OS: Windows 8.1
Drivers: Nvidia 344.16 WHQL (GTX 970), 344.11 WHQL (GTX 680); AMD Catalyst 14.9

Methodology

We ran the benchmarks on a fresh install of Windows 8.1 with the latest drivers for our test system.
We use DDU (Display Driver Uninstaller) to clear out previous drivers when swapping GPUs, and we reboot the system fresh for every resolution and setting.

The game ships with an internal benchmark, and it is forgiving to say the least. It counts frame rates on a black screen, in excess of 500 FPS on some occasions, which produces "fake" averages far higher than what we see during gameplay. The benchmark scene is also very small and not representative of resource usage during actual play. Instead, we did a standard 60-second benchmark run: starting by jumping from the tower at Udun Foothills (where you spawn after death or when continuing a saved game), killing the Orcs and Uruks below the tower, and running along the Black Road towards the ruins.

Nvidia SLI and AMD CrossFire are not officially supported (as of yet), so we created workarounds to get them running. For AMD CrossFireX we created a custom profile for the game in Catalyst Control Center, with frame pacing off and CrossFireX set to AFR-friendly. For the SLI GTX 680 setup we added a profile with the Nvidia Inspector tool, using the SLI compatibility bits from F.E.A.R. 3. For the SLI GTX 970 setup we used Alternate Frame Rendering 2, as we couldn't get it working with the SLI bits.

The standard game presets are Lowest, Low, Medium, High, Very High and Ultra. To run Ultra you will need to download the 3.7GB Ultra HD texture pack. The maximum frame rate during gameplay is capped at 100, so if you use something like the Fraps in-game overlay on a fairly fast system you will easily see it.

In the bottom-right corner of the advanced video menu you can see your total video RAM. Strangely, on our AMD CFX system the total VRAM shown is doubled, even though it is well known that in multi-GPU setups the effective memory is mirrored, with each GPU using its own video memory independently. Even during CrossFire tests we see peak VRAM readings at double the normal values, a clear error on Monolith's side.
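To see why a handful of 500+ FPS black-screen frames can inflate an average the way the internal benchmark's does, here is a minimal sketch (our own illustration, not the game's benchmark code): average FPS should be computed over wall time, and implausibly fast frames should be discarded. The 2ms floor is an assumption chosen for illustration.

```python
def average_fps(frametimes_ms, floor_ms=2.0):
    """Average FPS over wall time, ignoring frames faster than floor_ms
    (2 ms = 500 FPS, the kind of reading a black screen produces)."""
    valid = [t for t in frametimes_ms if t >= floor_ms]
    if not valid:
        return 0.0
    return 1000.0 * len(valid) / sum(valid)

# A log polluted by 100 black-screen frames at 1 ms (1000 FPS each),
# followed by 300 real frames at ~60 FPS:
log = [1.0] * 100 + [16.7] * 300

# The naive average counts the bogus frames and reports ~78 FPS;
# filtering them recovers the true ~60 FPS experience.
naive = 1000.0 * len(log) / sum(log)
print(round(naive), round(average_fps(log)))  # 78 60
```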
We couldn't be objective enough without adding frametimes. For those who don't already know, frametime is a metric we like analysing because frame rates alone don't describe how smooth a game feels on a given setup. High frametimes, or "stuttering" as most describe it, occur when it takes too long to move from one frame to the next, causing a slight delay. Your frame rate might be fine and above 30 FPS, yet you still feel a slight "pause" or "cut" in the general flow of gameplay.

A VRAM bottleneck shows up when system RAM usage and the pagefile start spiking as the graphics card approaches its VRAM limit. With the MSI Afterburner OSD or an equivalent program you can watch VRAM, system memory and pagefile usage; you will usually feel a slight stutter when the latter two start climbing.

To pass our playability test, the 99th percentile of frametimes must be at or below 16.7ms, the worst 0.1 per cent of frames must not spike above 50ms, and the game must not exhaust all VRAM resources. In our experience these standards ensure no bottlenecks, visual lag or stuttering at 60Hz. Some users may tolerate higher or lower values depending on their hardware and monitor refresh rate.

Image Comparison

We were expecting a mind-blowing visual experience given the VRAM requirements shown in the options menu, but sadly it was not to be. In full HD, any preset below High looks extremely poor. Very High and the Ultra pack look a lot better, but there is an abundance of jagged edges that some people just won't enjoy, and no additional anti-aliasing settings to avoid the issue. There is clearly a lot of work in the character detail, and the draw distance is good, but the other textures aren't great. The game shines on the Medium or higher presets at 4K, yet still reminded us of a slightly above-average current-gen AAA title.
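The playability test described in the benchmarking section above (99th percentile at or below 16.7ms, worst 0.1 per cent of frames below 50ms) can be sketched as follows. This is our own illustration, assuming frametimes captured in milliseconds, for example from a Fraps frametimes log; the function names are ours, not from any tool, and the VRAM criterion is checked separately by watching the OSD.

```python
def percentile(values, pct):
    """Nearest-rank percentile of a list of frametimes in milliseconds."""
    ordered = sorted(values)
    rank = max(1, round(pct / 100.0 * len(ordered)))
    return ordered[rank - 1]

def passes_playability(frametimes_ms):
    """Our 60 Hz criteria: 99% of frames within 16.7 ms (60 FPS),
    and the worst 0.1% of frames never spiking above 50 ms."""
    p99 = percentile(frametimes_ms, 99)
    p999 = percentile(frametimes_ms, 99.9)
    return p99 <= 16.7 and p999 <= 50.0

# A smooth run passes; a run with a burst of 60 ms stutter frames fails.
smooth = [15.0] * 990 + [16.0] * 10
stuttery = [15.0] * 980 + [60.0] * 20
print(passes_playability(smooth), passes_playability(stuttery))  # True False
```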
If you want better image quality at anything lower than 4K, downsampling will clear away the jagged edges, but be careful: it kills performance. The Very High and Ultra presets show little to no visual difference beyond the massive performance hit of running maxed-out settings.

1920×1080 Lowest – Low – Medium Benchmark

All of our test GPU configurations handle the lower 1920×1080 presets fairly well, with a lowest minimum frame rate of 77 FPS and averages above 90 FPS. We see no unusual anomalies in frametimes, nor any high VRAM usage. All cards pass our playability test with flying colours.

1920×1080 High – Very High – Ultra Benchmark

On the Very High and Ultra presets, the GTX 680 and SLI GTX 680 struggle due to the VRAM limitation of our reference cards (2GB). The SLI GTX 680 setup seems to have the horsepower, but the VRAM causes a bottleneck and hiccups during gameplay. The GTX 970 and R9 290 pass our playability test on every preset. While the multi-GPU setups perform very well at High and Very High, they show very low minimums on Ultra due to our unofficial SLI and CFX workarounds. When AMD and Nvidia release official profiles for the game, we should see better scaling and no sudden frame drops.

3840×2160 Lowest – Low – Medium Benchmark

The GTX 680 puts up a fight at the 4K Lowest and Low presets, but lacks both the horsepower and the VRAM for the Medium preset and above. The SLI GTX 680 system has the horsepower for Medium or higher, but seems to hit a VRAM bottleneck; if you have the 4GB variants you should be able to run Medium. The single R9 290 and GTX 970 show very slight frametime stutter during fights or terrain loads at the High preset; it becomes more visible at higher presets, but nothing tragic.

3840×2160 High – Very High – Ultra Benchmark

Our single GPUs can't pack enough punch to handle these settings.
Our CFX R9 290 and SLI GTX 970 systems again suffer from our unofficial workarounds, with strange frame drops and frametime spikes at the High and Very High presets. At 4K with the Ultra HD pack, the 4GB cards simply aren't enough: we see full utilisation of video memory, with swapping into system resources. System memory usage climbs above 6GB and the pagefile past 11GB, bringing all our setups to their knees.

Conclusion

Time to see what all the fuss was about. Do we really need 6GB of VRAM for Shadow of Mordor? To be totally honest, yes and no. At 1920×1080 Ultra you will need roughly 4GB; it is with downsampling (200%) or at 4K with the Ultra pack that the 6GB requirement becomes a must.

To explain exactly what we mean, we noted a few strange things while running the Ultra HD texture pack at 4K and at 1080p downsampled (200%). Our highest-VRAM cards carry 4GB; after usage touched the 4GB limit, our test system's RAM usage rose to 5.9–6GB, and then the pagefile started hitting very high values (11+ GB for us). Swapping and allocation across memory tiers slower than the graphics card's own causes noticeable bottlenecks.

The game automatically allocates as much VRAM as it needs from whatever is available, a point many people misunderstand in online discussions. On a 6GB card at 1080p with Ultra textures, for example, usage might peak at 5.7GB, while the same settings and resolution on a 4GB GPU peak at roughly 3.8GB; if that's not enough, the game moves on to system resources. At 1080p this shouldn't be much of an issue, but at 4K or with downsampling you're going to need a well-balanced high-end system to avoid bottlenecks. For an enjoyable experience at these settings with a 4GB GPU, you should probably just drop to the High preset.
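The spillover behaviour described above can be captured in a back-of-envelope model (entirely our own, not the engine's actual allocator): the game fills the card's VRAM first, and anything beyond that lands in system RAM and then the pagefile. The figures below are illustrative examples, not measurements.

```python
def split_working_set(working_set_gb, vram_gb, free_ram_gb):
    """Model where a texture working set ends up: VRAM first,
    then system RAM, then the pagefile. Returns GB per tier."""
    vram_used = min(working_set_gb, vram_gb)
    overflow = working_set_gb - vram_used
    ram_spill = min(overflow, free_ram_gb)
    pagefile_spill = overflow - ram_spill
    return vram_used, ram_spill, pagefile_spill

# 1080p Ultra on a 6GB card: everything fits in VRAM, no spill.
print(split_working_set(5.7, 6.0, 8.0))   # (5.7, 0.0, 0.0)

# A hypothetical 12GB working set on a 4GB card with 6GB of free RAM:
# the excess floods system RAM and then the pagefile.
print(split_working_set(12.0, 4.0, 6.0))  # (4.0, 6.0, 2.0)
```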
The uncompressed Ultra HD pack and the terrain-draw performance hits were the only notable problems we encountered. In general it is a good enough port that works with a range of older-generation hardware, albeit at the cost of image quality. We can only blame the terrain draw on the Monolith developers for not adding loading screens between areas of the map, or something similar, to avoid sudden frame drops in heavy combat. We have seen this in various open-world games, especially this early after release, and hopefully a patch will follow.

If you have any questions about the benchmark, please ask in the comment section below and I will be glad to answer.

Comments

Naeem ur Rehman: Fake benchmarks; the GTX 970 doesn't even have 4GB of VRAM, and it never goes above 3.5GB.

Nikolas Nikolaou: Actually it does access the remaining memory if forced; it's just slower. The driver tries to restrict usage to 3.5GB, but when running 4K with high levels of AA the card does use the full 3.5GB + 0.5GB.