Star Wars Battlefront – Ripoff of the Year?

Every year there is at least one massive failure in the PC gaming industry where a game ships with a long list of serious bugs, making it clear that meeting deadlines to capitalize on market hype and the holiday gifting wave took priority over quality. Entire studios have been driven out of business by releasing a poor product, with MicroProse’s botched release of “Falcon 4.0” being a prime example. The title “E.T.” for the Atari 2600 goes down in history as the biggest video game failure of all time and is often blamed for contributing to the video game crash of the 1980s. Is “Star Wars Battlefront”, published by EA and distributed on PC through its Origin platform, going to go down as the biggest failed release of the 2015 holiday season?

Clearly, if you are a game studio that has secured the license rights to release the next Star Wars title, and it happens to coincide with the release of the next film in the franchise, you know you are well positioned to earn serious revenue for your company. So you set out to communicate your vision to the teams you have hired to deliver it. It goes without saying that the graphics must be stunning (and that is the case). It should also go without saying that the game should be fun and keep you engaged, yet the general consensus is that after about 30 minutes it simply becomes “boring”. And it should go without saying that the game should work, yet the bug list is quite long and the severity of the bugs is quite alarming.

The list of common technical problems and workarounds, as of January 4th, is published here: Common gameplay issues and workarounds for Star Wars™ Battlefront

You can see that pretty much every aspect of the game is flawed on every platform. My favorite is “Technical hang or game freeze” with the recommended workaround “Restart your game.” What that basically means is that if you happen to find the game enjoyable and devote time to playing, it might hang or freeze at any moment and you will have to restart. Nice.

Game controllers don’t work, and there are video problems and connectivity issues. I bought the game for my 8-year-old son, and after an hour of playing, he had already moved on to something else. When I asked his opinion of the game he said: “it freezes and glitches too much”. Keep in mind he plays on a custom-built gaming system based on an AMD FX-6300 CPU and an NVIDIA GTX 950 GPU, so whatever the issues are, it isn’t the computer.

There you go. Did EA’s teams not test the product? Did they not run focus groups? Or did they know about all the problems but release it anyway, serious issues and all, in order to cash in on the Star Wars bonanza?

It is a common tactic to release now and hope your audience will be patient and wait for the patches. The problem is that once the game is released, the studios downsize their staff. It can take months of patches to make most of your customers happy, as was the case last year with Ubisoft’s “Assassin’s Creed Unity”.

Please note, “Star Wars Battlefront” and related marks are registered trademarks and property of Disney and EA, and I have not been authorized to use their trademarks or the image in this article, especially since my review is less than flattering.

Performance Benchmarking Assassin’s Creed Unity


Assassin’s Creed Unity (ACU) is a new release from Ubisoft that has not had a very successful launch, due to serious technical issues that made the game unplayable for many, and although several patches have been delivered, many players are still complaining about performance and frame rate. The main complaints seem to center on significant pauses (hang-ups) and/or low frame rate (frames per second). Those who are pondering whether to buy the game and want to understand what to expect on their hardware seek help in the Steam forums, only to be told “you just have to try it and see” or “it doesn’t run well on anything”. Those who have bought the game, want to understand why performance is lacking, and would like to know which component to upgrade are being told “buy a new computer”. In order to understand which compute resources the game depends on most, I ran some benchmarking tests and diagnostics, and the results even surprised me.

In a nutshell, my findings are:

  • The game obviously requires a powerful GPU
  • The game does not require more than 8GB RAM, and doesn’t seem to use more than 6-8GB even if there is a huge surplus in the system.
  • CPU horsepower requirement is not terribly high, and there is not a huge correlation between frame rate and CPU compute power.

For this testing, I used a system that is well tested and known to be stable with the following hardware:

  • Motherboard: ASUS P8Z77-V LK
  • CPU: Intel Core i7-3770K running at 3.5 GHz (single processor with 4 cores and 2 threads per core)
  • RAM: 24GB Kingston DDR3
  • GPU: MSI NVIDIA GTX 760
  • Disk System: OS is on an SSD; game binaries are on a hardware RAID comprised of three (3) SATA 6Gb/s drives.
  • OS: Windows 7 Professional 64-bit
  • Video Driver: NVIDIA version 347.09, driver only (GeForce Experience not installed)
  • Displays: Three 1080p LCD displays, not configured as a single display with NVIDIA Surround.

To perform the tests, I ran the game in windowed mode at 1920×1080 on one monitor, and on another monitor I ran Windows Performance Monitor (part of Windows 7) to track CPU utilization, memory utilization, CPU frequency %, page faults, and processor queue length. On the third monitor I ran the FurMark stress test (free from http://www.geeks3d.com), but only used its “CPU burner” feature to drive CPU utilization up and starve the game of CPU cycles. Lastly, I used FRAPS to quantify frame rate.
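For anyone who wants to capture a similar log without Performance Monitor, here is a minimal sketch using Python’s psutil library. It is my own illustration, not part of the actual test setup: the file name, sample interval, and duration are arbitrary choices, and it only records system-wide CPU and RAM usage rather than the full set of counters Performance Monitor tracks.

    # Minimal CPU/RAM logger sketch using psutil (pip install psutil).
    # File name, interval, and duration are arbitrary illustrative choices.
    import csv
    import time

    import psutil

    SAMPLE_SECONDS = 1      # seconds between samples
    DURATION_SECONDS = 300  # total logging time (5 minutes here)

    with open("acu_perf_log.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "cpu_percent", "ram_used_gb"])
        start = time.time()
        while time.time() - start < DURATION_SECONDS:
            # cpu_percent blocks for the interval and returns system-wide utilization
            cpu = psutil.cpu_percent(interval=SAMPLE_SECONDS)
            ram_gb = psutil.virtual_memory().used / (1024 ** 3)
            writer.writerow([round(time.time() - start, 1), cpu, round(ram_gb, 2)])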

I ran the game at the “Low” graphics quality preset versus “Ultra”, and used the CPU burner to see what effect having less CPU power available would have on the game.
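If you want to reproduce the starved-CPU condition without FurMark, a crude stand-in is easy to write. The sketch below is my own illustration, not FurMark’s actual burner: it spins up one busy-loop process per core (processes rather than threads, since Python threads would all share a single core).

    # Rough stand-in for a CPU burner: peg every core with a busy-loop worker
    # process, starving other applications of CPU time.
    import multiprocessing as mp


    def burn() -> None:
        """Spin forever doing meaningless math so the OS must share this core."""
        x = 0
        while True:
            x = (x + 1) % 1000003


    if __name__ == "__main__":
        workers = [mp.Process(target=burn, daemon=True) for _ in range(mp.cpu_count())]
        for w in workers:
            w.start()
        input("Burning all cores; press Enter to stop...")  # daemon workers exit with the script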

Running the game at the “Low” graphics preset, the frame rate was consistently in the 40-50 fps range, sometimes dipping to 25 fps. CPU utilization was about 40% on average. Memory usage was about 4GB and did not fluctuate. There were no “hang-ups” where the game pauses for a second or so.

Running the CPU burner with 8 threads did not affect frame rate significantly and did not cause the game to pause or stutter. However, overall CPU utilization was 100%, because both the game and the burner were competing for the CPU.

Running the game at the “Ultra” graphics preset, the frame rate was consistently around 30 fps, dipping at times to 20-25. CPU utilization was significantly lower, approximately 20% on average. This makes sense: at the higher settings the GPU becomes the bottleneck, so the CPU spends more of its time waiting and prepares fewer frames per second. With the CPU burner running, I did experience a few pauses.

The “Ultra” graphics quality preset sets all graphics options (environment, texture, shadows, etc.) to their highest values, except that anti-aliasing is not maxed and is set to MSAA 2x. The final test was to also increase anti-aliasing to the highest setting, TXAA (NVIDIA’s temporal anti-aliasing).

Running the game with all settings maxed and the CPU burner off, the frame rate was 25-30 fps on average, dipping below 20 fps. There were no pauses or hang-ups, but the game was very choppy due to the low frame rate. Turning on the CPU burner didn’t bring the frame rate down much further, but the pausing was frequent and pronounced.

The results of these tests show that increasing memory above 8GB will not improve the game’s performance; performance is mostly determined by the GPU and CPU. A system with subpar CPU bandwidth can run the game fairly well if the GPU is hefty, by running with all graphics settings maxed, which shifts work onto the GPU and relieves the CPU. A system with a hefty CPU but a lower-end GPU can run the game fairly well by turning off anti-aliasing. But a system with a subpar CPU and a GPU at the lower end of the gaming spectrum doesn’t really have a chance of running without the frame rate dropping into the teens and pausing.

Lastly, I will offer some opinion on how playable the game is on non-ideal hardware. This game is quite playable with the frame rate averaging in the high 20s. It is not a flight or racing simulator, and the highest speed is a jogging pace. Most of the time the player is standing still. Those who adhere to the rules of thumb that “fps must be above 60 or it sucks” and “graphics settings must be maxed or it sucks” don’t stand a chance of being happy.

If you want a higher frame rate, run lower graphics settings. The visual difference is not that significant. But if you want the best visuals and can’t tolerate 30fps, you had better be running a GTX 900 series GPU.

Running two GPUs in SLI mode would be a good idea, except there are a lot of reports that patch 4 broke SLI. So don’t run out and buy a second video card in hopes of improving the experience with this game.

Happy Gaming!

Tom C

Tom C Forecast: Cloud Gaming Will Transform the Gaming Industry


The cloud is the next big thing, and NVIDIA is looking to capture market share with a cloud offering specifically for gaming. It is a good strategy, because there is a big need for this type of cloud service. Video gaming is a $20 billion industry, and the video card is a key hardware component in gaming. On average you need a GPU that retails for $300-$500 to get a decent gaming experience, and it needs to be refreshed about every 2-3 years to keep up with gaming technology.

The average modern video card has a GPU with about 1,000 cores. To refresh memories, CPUs have 4-8 cores. Yes, for this kind of massively parallel work the GPU leaves your Intel Core i7 CPU in the dust. It takes that much compute power to render a 3D world at 30 frames per second, with each frame containing hundreds of polygon-mesh 3D objects on the screen and each object having hundreds of thousands of vertices. Your GPU has to compute all these vertices in 3D space, relative to the player’s viewpoint, fill in the faces with textures, and compute and apply lighting, shading, etc.
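To put a rough number on that workload, here is a back-of-envelope calculation using the figures from the paragraph above; the exact counts are my own illustrative picks, not measurements.

    # Back-of-envelope vertex throughput using illustrative figures: hundreds of
    # objects per frame, hundreds of thousands of vertices each, 30 frames per second.
    objects_per_frame = 300
    vertices_per_object = 200_000
    frames_per_second = 30

    transforms_per_second = objects_per_frame * vertices_per_object * frames_per_second
    print(f"{transforms_per_second:,} vertex transforms per second")
    # -> 1,800,000,000 per second, before any texturing, lighting, or shading

Nearly two billion vertex transforms per second is exactly the kind of trivially parallel work a 1,000-core GPU is built for and a 4-8 core CPU is not.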

That is why a computer with a $75 video card can play streaming video without issue but choke on a 3D game: the GPU has to calculate each frame in real time. This is where the GPU cloud comes in, because if the 3D rendering can be offloaded to a cloud service, then everyone can enjoy the games and apps that currently demand $500 video cards using nothing more than a $75 video card.

The typical 3D game tracks the player’s position and viewpoint, along with the millions of vertices that make up the polygons defining the world and whatever is currently in the scene. The orientation of all those vertices from the perspective of the player’s viewpoint is calculated by the GPU, and then the faces of the polygons are filled in with textures. All of this is done by your high-end GPU today, but in the future it would be done by the gaming cloud instead.

To achieve this, games would be written to be cloud friendly, and a module would run in the cloud that knows the game’s visual make-up: the objects in the world, the polygons and vertices that make up each one, and the textures needed for the polygon faces. All of this is stored on your PC today, taking gigabytes upon gigabytes of space, because the graphics rendering is done by your PC. All of it would be offloaded to the cloud, and the local game engine would simply transmit your position, and the positions of other objects, up to the game cloud; the rendered video comes back just like streamed digital content.
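To make that division of labor concrete, here is a hypothetical sketch of the small per-frame state update a thin client might send up to a gaming cloud. The field names, the JSON encoding, and the whole message shape are my own invention for illustration; no actual service works exactly this way.

    # Hypothetical per-frame message from a thin client to a gaming cloud.
    # Everything here (field names, JSON encoding) is illustrative, not a real protocol.
    import json
    from dataclasses import asdict, dataclass, field


    @dataclass
    class StateUpdate:
        frame: int
        position: tuple = (0.0, 0.0, 0.0)   # player position in world space
        view_dir: tuple = (0.0, 0.0, 1.0)   # where the player is looking
        # Only the IDs and transforms of objects the client changed; the cloud
        # already holds all geometry and textures, so no vertex data crosses the wire.
        moved_objects: dict = field(default_factory=dict)


    update = StateUpdate(frame=1042, position=(12.5, 0.0, -3.2),
                         moved_objects={"door_17": {"angle_deg": 35.0}})
    payload = json.dumps(asdict(update))
    print(payload)  # this small message goes up; rendered video frames stream back down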

There is an additional layer of complexity in 3D rendering: light sources. I left that out above for simplicity. The rendering of a scene is also affected by the objects that emit light, such as sunlight in the scene, headlights on cars, etc. The gaming cloud would also have to know the lighting objects in the world/scene in order to apply lighting and shadowing just as your local GPU does today.

This is all very doable, and in the works. NVIDIA is working on it, and Amazon has some GPU-specific cloud offerings right now.

This is exciting and will have a huge impact on the gaming industry, because we will see the graphics capabilities in games take huge leaps forward. The reason is simple: game developers have to build a game for the hardware that the majority of the user base can afford to have in their homes at the time. About every 5 years we see a title that ups the ante and offers the next generation of visual complexity and detail; but when that happens, the studio is bombarded with complaints and criticism because a good segment of the user base doesn’t have the hardware required to run the game. That is very costly and damaging to the studio’s reputation in the industry, so it makes sense that studios try to avoid huge leaps forward in order to steer clear of that pain.

It is hard to predict what the business model for gaming in the cloud will be. I would expect some type of monthly charge based on usage time and GPU compute power. Will that charge go to the end user, or will the studios eat the cost and roll it into the price of the title?

One barrier to transitioning to cloud gaming is that people who already have high-end GPUs will not want to pay for a cloud service they don’t need, having already made the investment in high-end GPU hardware. Because of this, I would anticipate games being released that offer both local and cloud GPU support, which lets consumers who have the hardware get their ROI while allowing those who do not have the GPU hardware to run the game without a hardware investment.

Console systems (PlayStation, Xbox, Wii, etc.) will come down drastically in price, and may become obsolete altogether. The graphics hardware is the biggest driver of the cost of the system; once graphics are offloaded to the cloud, the console becomes a thin-client streaming device.

Gaming on mobile devices (Android/iOS) will be much improved. Mobile devices have serious constraints that create a barrier to good gaming:

  • Tight power-consumption constraints: a tablet that only lasts 2 hours on battery won’t sell, which means advanced graphics compute power isn’t possible.
  • Tight hardware-cost constraints: the price point for the average tablet is no longer $600, which also means advanced graphics capabilities aren’t possible.
  • Tight storage constraints: the average mobile device has 16GB of storage, which means games cannot keep 20GB of textures on the device.

Cloud gaming will alleviate all these constraints and you will be able to play the same titles you can play on a PC or console.

Lastly, revenue losses due to piracy will be reduced, and the headaches of digital rights management (DRM) will no longer be necessary. People will not be able to play games they did not purchase, because most of what the game does is offloaded to the cloud. And people who don’t pirate games won’t have to pay a slightly higher price to offset the losses caused by those who do.

Happy Gaming!

Tom C

PC Gaming Master Class: Tips for Running Assassin’s Creed Unity

The release of “Assassin’s Creed Unity” (ACU) on the PC platform has been met with a frustrated user base that finds the performance of the game unacceptable. This is very common when a game marks the transition to the next generation in 3D gaming. Having over 20 years of experience with PC gaming, hardware, and tweaks & hacks, I’ll offer my advice.

The fundamental problem is that ACU isn’t built on 5-10-year-old 3D engine technology, so people who try to run it on hardware that is five years old find it doesn’t run very well, and running the game at the highest graphics settings today requires high-end hardware.

First and foremost, set realistic expectations. This game is visually stunning, and the level of detail and number of objects on the screen are incredibly high. Every one of those objects, from the NPCs (non-player characters) to the inanimate objects to the leaves on the trees, is made up of hundreds of thousands of polygons, which your graphics card has to compute based on the viewpoint, render with textures, and then apply lighting and shadowing to. Your CPU has to feed your GPU the data it needs to do its job, and it also has to calculate the movement of the NPCs, the leaves blowing in the wind, your character’s movement, etc. And all of this takes RAM as well, both on your graphics card and on your motherboard.

Each scene in ACU has 50-100 NPCs on screen at any given time, and each of them takes CPU and GPU cycles. Contrast that with, say, Grand Theft Auto IV, which has 5-10 NPCs in view at any given time.
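A toy sketch of a game’s CPU-side update loop (entirely illustrative, not ACU’s actual code) shows why crowd size hits the CPU directly: every NPC needs its own AI/animation/physics update each frame before the GPU can draw anything.

    # Illustrative only: per-frame CPU work scales linearly with the number of NPCs.
    import random


    def update_npc(npc: dict, dt: float) -> None:
        # Stand-in for the pathfinding, animation, and physics the CPU does per NPC.
        npc["x"] += npc["vx"] * dt
        npc["y"] += npc["vy"] * dt


    def simulate_frame(npcs: list, dt: float = 1 / 30) -> None:
        for npc in npcs:  # this loop is the CPU cost that grows with crowd size
            update_npc(npc, dt)
        # ...the CPU then hands the updated positions to the GPU to render


    # An ACU-scale crowd of 100 NPCs costs roughly 10x the per-frame CPU time of a
    # GTA4-scale scene with 10.
    npcs = [{"x": 0.0, "y": 0.0, "vx": random.uniform(-1, 1), "vy": random.uniform(-1, 1)}
            for _ in range(100)]
    simulate_frame(npcs)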

GTA4 is a good example to make a few points. When GTA4 was released circa 2008, it met with similar performance complaints, because people with the best systems of that era found for the first time that they couldn’t play at max graphics settings. It took a few patches to wring out the issues, and many years before the average PC user’s hardware caught up to the point that the game was no longer such a GPU/CPU hog. The CPU/GPU/memory required to run GTA4 well in 2008 would have cost $8,000; a few years later that same compute power could be found in $799 department-store PCs.

So as frustrated as we are that we can’t set everything to max quality, we have to set the graphics quality to run on what we have today. If it is any consolation, the people who don’t abandon this game now out of frustration can look forward to it being relevant for the next 10 years. That is about how long it will take before new games start making ACU look like yesterday’s 3D gaming technology.

All that being said, let me make some suggestions….

1) Stability & Heat/Power – Make sure your PC is stable to begin with. The fact that you haven’t had issues before doesn’t mean much, because you have never pushed your CPU/GPU/motherboard this hard. You may have heat-dissipation issues or an underpowered power supply that has never materialized into problems because nothing has pushed the system hard enough.

Download the FurMark burn-in/benchmark test and Open Hardware Monitor. Use FurMark to push your GPU and CPU to max for as long as you typically play (hours), and use Open Hardware Monitor to make sure that your CPU/motherboard/GPU temperatures remain stable and the various voltages remain stable. If you see temperatures just keep rising, you have a heat-dissipation problem; you will need to look at a better heat sink and fan for the CPU and a better-ventilated case with plenty of additional fans. If you see voltages that are unstable, you have a power supply issue. Either of these problems will make your PC unstable.
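If you would rather log those readings to a file than watch them, here is a hedged sketch that pulls temperatures out of Open Hardware Monitor while FurMark runs. It assumes Open Hardware Monitor is running and publishing its sensors over WMI (as far as I know, under the root\OpenHardwareMonitor namespace), and it uses the third-party Python “wmi” package; adjust or discard it if your setup differs.

    # Hedged sketch: read temperatures from a running Open Hardware Monitor instance
    # via WMI (assumed namespace root\OpenHardwareMonitor) once a minute during a
    # FurMark burn-in. Windows only; requires "pip install wmi".
    import time

    import wmi

    ohm = wmi.WMI(namespace="root\\OpenHardwareMonitor")

    for _ in range(60):  # roughly an hour of readings
        for sensor in ohm.Sensor():
            if sensor.SensorType == "Temperature":
                print(f"{sensor.Name}: {sensor.Value:.1f} C")
        print("---")
        time.sleep(60)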

2) Video Drivers – It is fairly obvious that NVIDIA is in a partnership with Ubisoft (as it is with nearly everyone else), and you can bet the development and testing were targeted at NVIDIA first and everyone else second. If you have an NVIDIA card, you are probably going to be better off with this game (and many others). The game is developed against DirectX 11 (IIRC) and should work with any vendor’s card that conforms to that spec, but in reality we often find that vendors interpret standards differently.

That being said, even if you are running an NVIDIA card, you might have some issues. First, you need to be on the latest driver (344.75 IIRC). I am surprised at the number of people on Ubisoft’s forums who were running old versions and, after updating the driver as suggested, reported that it helped a lot. Updating your drivers should be something you consider any time a new driver is out, and especially when you are running a new release and having issues.

That being said, I have found some stability issues with NVIDIA’s software stack. I was working on a system that had lockups in 3D games, and found that removing the drivers and re-installing them without GeForce Experience resolved the issues. While troubleshooting, I found reports on the net that a certain process was known to have memory leaks, and when I looked at that process it was indeed growing in memory consumption. I don’t recall the specific process name, but I believe it has to do with NVIDIA streaming and is part of the GeForce Experience software, so I don’t run it. Yes, I know that if I had an NVIDIA Shield I’d be in a pickle. Run the driver only and see if stability improves, and if you need functionality that GeForce Experience provides, contact NVIDIA and put the onus on them to explain why stability decreases when you install it.

Also, in late October and into early November, NVIDIA released driver updates seemingly every few days. I would venture a guess that this was related to this game and to issues identified in testing that required driver fixes. So I doubt anything older than 344.75 can be expected to work.

NOTE – It looks like NVIDIA released 347.09 on 12/23, which I was not aware of and will test next.

Some people will find that older games may not work well with anything newer than a legacy driver version. There isn’t much you can do about that; the avenue for resolving it is with the maker of the game that doesn’t work on new drivers.

I am sure that those running other video card makes will also benefit from using the latest drivers, and probably also should install the driver only and not the additional crap the vendor includes.

3) Video Card Hardware – I run ACU on a GTX 760-based system and a GTX 750-based system. It is doubtful that a good gaming experience can be had with a GTX 6xx-series video card, even though I am sure the system requirements say it will work.

4) Memory – You need a lot of it, and it needs to be top notch. Experienced system builders know that stability is often impacted by memory sticks that should work fine in a motherboard but just don’t. Mixing and matching brands and types should be avoided. If your system came with 8GB of generic memory and you want to upgrade to 16GB, for best results avoid keeping the existing 8GB and adding another 8GB of a different brand, type, or frequency. For best results, run memory that is specifically listed by your system or motherboard maker as compatible. This memory will typically cost more, but that is the price you pay for the best stability and performance. You can try running other memory or combinations, and you may or may not find success.

5) BIOS Firmware – If your system maker has released a newer BIOS firmware than the one you are running, you should upgrade. Firmware releases typically address stability issues discovered with certain memory brands and types. If you are running memory not specifically listed as compatible, a BIOS update might be just what makes your memory work.

6) CPU – Using an Intel CPU will typically give the best compatibility. AMD makes a fine CPU, but it is not unheard of for issues to be related to the CPU. You can count on the majority of development and testing being done on Intel CPUs, so while using AMD might save a few bucks, you pay for it by possibly being affected by such issues.

That being said, you need a powerful CPU. Yes, the graphics card bears the brunt of the abuse in a 3D game, but the CPU needs to track & manage all the objects and such.

7) System Overclocking – It should go without saying that overclocking is done at one’s own risk. If you are overclocking, and even if you have never had any issues before, go back to default settings and make sure overclocking isn’t your culprit.

8) Graphics Settings – Set them to low and see if your gaming experience and stability improve. Yes, we all want to run on high, but be realistic. You can use FRAPS to see what your frame rate is. Use the advanced graphics settings to change one thing at a time, and examine how it improves the visuals and what it does to the frame rate. Pick the combination that gives the best look while keeping FPS at or above 30. I am currently running with everything set fairly high but anti-aliasing set low; for me, anti-aliasing is what kills the frame rate, and not just in this game.
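If you log a benchmark run with FRAPS’s detailed per-frame stats, a short script can summarize the result more honestly than the on-screen counter. The sketch below assumes the frametimes CSV FRAPS writes has a header row and a cumulative time-in-milliseconds column, which is how I remember it; the file name is just an example.

    # Summarize a FRAPS frametimes CSV: average FPS plus the worst 1% of frames.
    # Assumes a header row and a cumulative "Time (ms)" second column; the file
    # name below is only an example.
    import csv

    frame_times_ms = []
    with open("ACU 2015-01-04 frametimes.csv", newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        prev = None
        for row in reader:
            if len(row) < 2:
                continue  # skip blank or malformed rows
            t = float(row[1])
            if prev is not None:
                frame_times_ms.append(t - prev)  # duration of this frame
            prev = t

    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
    worst_fps = 1000.0 / (sum(worst) / len(worst))
    print(f"Average FPS: {avg_fps:.1f}")
    print(f"Worst 1% of frames: {worst_fps:.1f} FPS")

Averages hide the pauses; the worst-1% figure is what actually tells you whether a settings change fixed the stutter.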

Best of luck, happy gaming, and patience!

Tom C