Performance Benchmarking Assassin’s Creed Unity


Assassin’s Creed Unity (ACU) is a recent release from Ubisoft that had a rocky launch: serious technical issues made the game unplayable for many, and although several patches have been delivered, players are still complaining about performance and frame rate. The main complaints center on significant pauses (hang-ups) and low frame rates (frames per second). Prospective buyers who ask on the Steam forums what to expect from their hardware are told “you just have to try it and see” or “it doesn’t run well on anything.” Those who have bought the game, want to understand why performance is lacking, and would like to know which component to upgrade are told “buy a new computer.” To understand which compute resources the game depends on most, I ran some benchmarking tests and diagnostics, and the results surprised even me.

In a nutshell, my findings are:

  • The game obviously requires a powerful GPU
  • The game does not require more than 8 GB of RAM, and doesn’t seem to use more than 6-8 GB even when there is a huge surplus in the system.
  • CPU horsepower requirements are not terribly high, and there is little correlation between frame rate and CPU compute power.

For this testing, I used a system that is well tested and known to be stable with the following hardware:

  • Motherboard: ASUS P8Z77-V LK
  • CPU: Intel Core i7-3770K running at 3.5 GHz (single processor with 4 cores and 2 threads per core)
  • RAM: 24 GB Kingston DDR3
  • GPU: MSI NVIDIA GeForce GTX 760
  • Disk system: OS on an SSD; game binaries on a hardware RAID comprised of three (3) SATA 6 Gb/s drives
  • OS: Windows 7 Professional 64-bit
  • Video driver: NVIDIA version 347.09, driver only (GeForce Experience not installed)
  • Displays: Three 1080p LCD displays, not configured as a single display with NVIDIA Surround

To perform the tests, I ran the game in windowed mode at 1920×1080 on one monitor. On a second monitor I ran Windows Performance Monitor (included with Windows 7) to track CPU utilization, memory utilization, CPU frequency %, page faults, and processor queue length. On a third monitor I ran the FurMark OpenGL stress test (free from http://www.geeks3d.com), but used only its “CPU burner” to drive CPU utilization up and starve the game of CPU bandwidth. Lastly, I used Fraps to quantify frame rate.
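Incidentally, what a tool like Fraps reports boils down to simple arithmetic on per-frame render times. A minimal sketch (the function and the sample numbers are my own illustration, not Fraps output) shows why an “average” fps figure can hide the dips that feel like stutter:

```python
def fps_stats(frame_times_ms):
    """Average and worst-case fps from per-frame render times in milliseconds."""
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    return {
        "avg_fps": 1000.0 / avg_ms,
        "min_fps": 1000.0 / max(frame_times_ms),  # fps during the slowest frame
    }

# 100 frames alternating between 20 ms and 40 ms render times
stats = fps_stats([20.0, 40.0] * 50)
print(round(stats["avg_fps"], 1), stats["min_fps"])  # → 33.3 25.0
```

An average of 33 fps sounds smooth, but half of those frames were rendering at an effective 25 fps, which is exactly the kind of dip described below.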

I ran the game at the “Low” graphics quality setting versus “Ultra”, and used CPU Burner to see the effect of leaving less CPU power available to the game.

Running the game at “Low” graphics settings, the frame rate was consistently in the 40-50 fps range, sometimes dipping to 25 fps. CPU utilization averaged about 40%. Memory usage was about 4 GB and did not fluctuate. There were no hang-ups where the game pauses for a second or so.

Running CPU Burner with 8 threads did not affect the frame rate significantly and did not cause the game to pause or stutter. However, overall CPU utilization was 100% because the game and CPU Burner were competing for the CPU.

Running the game at the “Ultra” graphics setting, the frame rate was consistently around 30 fps, dipping at times to 20-25. CPU utilization was significantly lower, approximately 20% on average. This makes sense: at these settings the GPU becomes the bottleneck, so the CPU spends more of its time waiting on the GPU. With CPU Burner running, I did experience a few pauses.

The “Ultra” graphics quality preset sets every graphics option (environment, texture, shadows, etc.) to its highest value, except that anti-aliasing is not maxed and is set to MSAA 2X. The final test also raised anti-aliasing to the highest setting, TXAA (NVIDIA’s temporal anti-aliasing).

Running the game with all settings maxed and CPU Burner off, the frame rate averaged 25-30 fps, dipping below 20 fps. There were no pauses or hang-ups, but the game was very choppy due to the low frame rate. Turning on CPU Burner didn’t bring the frame rate down significantly, but the pausing became frequent and pronounced.

The results of these tests show that increasing memory above 8 GB will not benefit the game’s performance; performance is determined mostly by the GPU and CPU. A system with subpar CPU bandwidth can run the game fairly well if the GPU is hefty, by running with all graphics settings maxed, which shifts some of the work off the CPU. A system with a hefty CPU but a lower-end GPU can run the game fairly well by turning off anti-aliasing. But a system with a subpar CPU and a GPU at the lower end of the gaming spectrum doesn’t really have a chance of running without the frame rate dropping into the teens and pausing.

Lastly, I will offer some opinion on how playable the game is on non-ideal hardware. The game is quite playable with the frame rate averaging in the high 20s. It is not a flight or racing simulator; the highest speed is a jogging pace, and most of the time the player is standing still. Those who adhere to the rules of thumb that “fps must be above 60 or it sucks” and “graphics settings must be maxed or it sucks” don’t stand a chance of being happy.

If you want a higher frame rate, run lower graphics settings. The visual difference is not that significant. But if you want the best visuals and can’t tolerate 30fps, you had better be running a GTX 900 series GPU.

Running two GPUs in SLI mode would be a good idea, except there are a lot of reports that patch 4 broke SLI. So don’t run out and buy a second video card in hopes of improving the experience with this game.

Happy Gaming!

Tom C
