Batman: Arkham Origins Benchmark
Milos M. | October 25, 2013 | Benchmarks

The previous Batman titles, Batman: Arkham Asylum and Batman: Arkham City, followed chronological order. This title, however, does not take place after Arkham City, but five years before the 2009 Arkham Asylum, where Batman is a greenhorn superhero with an attitude. Since developers nowadays seem unconcerned with chronology, you will often find yourself in odd situations where the offspring are predecessors to their ancestors.

Big hype has surrounded this game. Warner Bros employed top-dog voice actors: Roger Craig Smith as Batman (also known for voicing Ezio Auditore in the Assassin's Creed franchise) and Troy Baker as the Joker, a rising star who has made a name for himself voice acting in many games, such as BioShock Infinite and The Last of Us. Whether Warner Bros delivered what they marketed is another question, one we may address in future articles. But enough about the game; this is a benchmark.

Since 2009, Nvidia has worked closely with the developers of the Batman series. Whether it was Rocksteady Studios or WB Montreal, Nvidia was always there (and now here) to support the game with the green company's features, such as PhysX (a feature the Batman series has had since the beginning) and TXAA. Since this is one of those "The Way It's Meant to Be Played" titles, one would expect the game to run better on Nvidia cards. Let's find out how right that expectation is.
Batman: Arkham Origins Benchmark Pages

Table of Contents
- Introduction
- System Requirements
- Test Systems
- 2560×1440 MSAA 8x
- 2560×1440 FXAA High
- 1920×1080 FXAA High
- System and Video Memory Usage
- Performance Improvement with Catalyst 13.11 Beta v6 AMD Drivers
- Conclusion

Comments

FlameWater: Go to the Steam forums and look at the topics of people trying to figure out why 5-year-old tech and integrated graphics can't max out the game. LOL

Matt: That's any and all forums concerning PC gaming…

zpoccc: It's annoying that the more technically powerful GPU actually performs worse. Thankfully, this type of thing won't last (at least in Nvidia's favor) once the next gen is in full swing and all games are coded to be AMD-optimized by default.

Matt: Negative. More DX11-optimized games are coming, and not just AMD-optimized ones outside of games supporting Mantle. Hardware-agnostic coding will still remain at the top for the majority of the PC market. There are still DX11 optimizations that haven't been utilized yet.

zpoccc: The bottom line is, developers releasing next-gen console versions of games are going to spend a lot of time optimizing for the AMD GPUs in those consoles. It seems obvious that that effort will translate into better optimization for AMD GPU-equipped PCs.

Matt: But it doesn't work that way. Not even Mantle allows for completely universal code between console and PC, so optimization is still dependent on the devs.

zpoccc: I never said anything about "complete universal coding between console and PC", just that AMD will benefit from optimization done on the console end of things.

herpderpherp: Nice benchmark. I was wondering whether you had PhysX enabled as well? How much of a performance drop is observed when enabling PhysX? It would have been nice to know this as well. However, if the GTX 770 wins against the 7970 with PhysX enabled on the 770, then it's truly awkward.
Matt: It's just a rebranded 680 with minor overclocks and a beefed-up reference cooler. I see a bigger point in the FX-8350 keeping up with the i7-4770K.

c4toast: Later, the 290X beat the Titan by 350%. xD

Usman Khan: It's not 350% over the competition; read the article again. It's 350% over its own performance before the new drivers. It was 13-20 fps before the new drivers; now it's 40-50.