Digital Rights Management And Piracy Affect Ubisoft Fans

Today I'm going to provide some detail on the issues around digital rights management and how they affect the way software publishers do business.

I obtained a key through what I thought was an authorized, legitimate third party at the end of December, and yesterday my key got banned. I didn't even think to contact Ubisoft to complain or ask for help, but I can see how people would think this is a Ubisoft issue, because the key "worked" for a period of time and it was Ubisoft's action to ban the key.

This is a digital rights management (DRM) issue. Certainly everyone can understand that game publishers have to take measures to prevent piracy of their work, because for whatever reason people don't view stealing a piece of software as morally equivalent to walking into Walmart and stealing a hard copy of a DVD. Anti-piracy measures often do impact paying customers in the form of inconvenience or restrictions, such as the old requirement that a physical CD or DVD be in the drive in order to play the game.

One common way DRM is done is through key codes. It is not possible to hard-code into the game only the keys that have already been sold, because once you buy the game, the game has no way of knowing about your transaction. So the game (or its activation system) has to recognize keys that haven't been sold yet, and when you buy a code from Ubisoft (in this case) or a partner such as Steam, you can rest assured that the code you bought will work for good.

You can probably guess that the keys are generated by an algorithm so that a key can be traced to where it was (or should have been) sold. I am sure Ubisoft knows which keys come from their direct sales, Steam, etc., and I doubt you'll ever find a case where those keys did not work or got banned.

So how can a key work at first? Simple – someone found a way to break the algorithm and obtain a code from the “not yet sold” pool of codes the game can recognize and sold one to you. But that doesn’t mean you are home free yet.
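To make the idea concrete, here is a minimal sketch of how an algorithmic key scheme could work. Everything in it is hypothetical (the secret, the key format, the signature length), but it shows why a publisher can recognize keys it never explicitly listed and trace a key back to a retailer batch:

    import hmac
    import hashlib

    # Hypothetical signing secret; a real publisher would keep this (or the whole
    # check) on an activation server rather than inside the game binary.
    SECRET = b"publisher-signing-secret"

    def make_key(serial: int, retailer: str) -> str:
        # Encode a retailer batch and serial number, then append a short signature.
        payload = f"{retailer}-{serial:08d}"
        sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()[:8].upper()
        return f"{payload}-{sig}"

    def looks_valid(key: str) -> bool:
        # The activation check can verify a key it has never seen before, without
        # storing a list of every unsold key.
        payload, _, sig = key.rpartition("-")
        expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()[:8].upper()
        return hmac.compare_digest(sig, expected)

    key = make_key(12345, "STEAM")
    print(key, looks_valid(key))  # a well-formed key passes; a random guess almost never will

Anyone who recovers the secret or reverse-engineers the algorithm can mint keys from that "not yet sold" pool, which is exactly the crack described above.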

The second part of code-based DRM involves tracking the codes in use and trying to enforce that only properly procured codes keep working indefinitely. In the olden days, before games were Internet connected, people could and did "share" copies of games by sharing their CD key. You were supposed to register your CD key and tie it to your email address, phone number, or something. Then when you called for support, they knew you were a legit paying customer… or if they had seen 500 registrations against the same code, they knew that 499 of them were thieves and 1 was a purchaser who enabled the thievery, so they didn't mind pissing that one guy off.

Or people found ways to simply crack the key algorithm and use keys themselves or sell them to unsuspecting buyers.

In this day and age, games don't even work if you don't connect the game to some online identity. That allows for a better player experience through multiplayer, online communities, etc. But it also allows the intellectual property owner to better enforce that only legit paying customers can play the game indefinitely, because the publisher has direct visibility into everyone who is playing the game, the codes used to authorize it, etc.

The unfortunate fact is that everyone who is impacted by a banned key is in that position because they were sold a code that, for whatever reason, turns out not to be legit. The code may have been intended for use in Thailand, where the game retails for $30, while the player's IP address obviously comes from the US, where the game retails for $60. I am making that up as an example, by the way, but yes, games, movies, etc. have different retail values in different countries due to local economic conditions, the value of the local currency, etc. So rather than say "tough" and charge people in Thailand the equivalent of $60 USD, which most can't pay, publishers discount to be competitive in that local market.

But keys that should have gone to a non-US region and instead show up as coming from players in the US will be considered invalid.
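As a toy illustration of that kind of region enforcement (the batch IDs and country codes below are invented), the check on the publisher's side could be as simple as comparing the region a key batch was destined for with the region the activation request comes from:

    # Hypothetical mapping from key batches to the region they were sold into.
    KEY_BATCH_REGION = {"TH-BATCH-0001": "TH", "US-BATCH-0042": "US"}

    def activation_allowed(batch_id: str, geoip_country: str) -> bool:
        # A Thailand-priced key activated from a US address fails this check.
        return KEY_BATCH_REGION.get(batch_id) == geoip_country

    print(activation_allowed("TH-BATCH-0001", "US"))  # False, so the key gets flagged or banned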

Another scenario is the business relationship between the third-party seller and the publisher. Obviously Steam, Amazon, Best Buy, etc. have some type of relationship and agreement with Ubisoft to re-sell the game digitally and issue codes. I am sure that relationship includes paying Ubisoft the agreed wholesale cost for each purchase, with the retailer keeping a small margin. The wholesale cost is probably not $20, which is why all the big retailers have the same price Ubisoft has: $60 USD. Ubisoft and the resellers account for the number of copies sold and the compensation owed back to Ubisoft.

The retailers selling this game at $25-$30 may or may not be operating within an agreement with Ubisoft. What is probably happening is that these sellers have figured out how to crack the keys, or have gotten hold of a lot they aren't entitled to, or bought a lot on bad credit, a fake bank account, a stolen credit card, etc. I am not saying every retailer that sells keys that end up getting banned is doing these illegal things. Some are; hopefully many are not. But they still, for whatever reason, aren't operating within an agreement with Ubisoft that entitles them to sell the game.

I was very disappointed when the key I bought for $35 USD got banned. Despite my best attempts to research the seller, and despite not finding any rip-off reports, I obviously ended up in the middle of an attempt to deprive Ubisoft of rightful compensation for the use of their intellectual property.

I bought my first copy through Steam for a steep discount in November as a gift to my son… I can't recall if it was $19.99 or $29.99, but it was not $59. Obviously a sale like that would be done in accordance with an agreement with the wholesaler. The site I bought from "explained" why they could offer the game for $35 USD: because they purchase codes in volume, etc. So I bought into it when I wanted a second copy for myself. Clearly I was misled, since my code got banned a handful of weeks later.

I contacted that seller promptly and gave them two options: either an immediate refund, or they could work out whatever issue exists between THEM and Ubisoft and get me working within 4 hours. They tried to talk me into giving them "time" to investigate and get back to me, but at that point I knew what I was dealing with and what they would try to do, which is to get me off their case and then not reply to future communications.

I told them to either refund me, or I would detail the experience on my blog for others to beware, track down their legal entity, and file a small-claims lawsuit, and I asked if that was worth $30 to them. The response I got back (in support chat) was "No, it isn't," and then they proceeded to refund my PayPal account.

This is what people who got pulled into this issue need to do — pin it on your retailer and force them to make it right. The retailer will try to direct you to Ubisoft just to get you off their case.

That being said, Ubisoft needs to re-examine its process around enforcing DRM to include some form of forewarning notification, an explanation of the above, etc. But perhaps they have tried that in the past and it just ends up costing them a lot of money handling the complaints from people who won't accept that this is a retailer issue.

Ubisoft also needs to recognize that households with multiple players sharing one PC should NOT have to buy multiple $60 copies of the game. This gets back to the issue of not being able to have multiple save files, profiles, etc. Since each player needs an online account and such, which adds to the cost of Ubisoft's operations, I could understand a fee of $10-$15 for additional players/profiles.

However, I am sure that if the above were done, it would become another mechanism exploited to deprive Ubisoft of their rightful revenue (by "selling" the additional profiles to others as another complete copy of the game).

I ended up re-purchasing an authorization code from Ubisoft directly, paying full retail; they happened to be running a buy-one-get-one special, so I got a deal after all.

Thank you,
Tom C

Review of Windows 10 Technical Preview


Windows 10 is out for technical preview by anyone who wants to try it. Simply go to Microsoft's site, look for the preview program, and sign up to get an ISO that will install Windows 10 with a trial license. I strongly recommend not wiping your existing OS to evaluate this; instead, create a VirtualBox VM and install to that. VirtualBox is a product from Oracle that is free to use and allows you to virtualize a PC so you can run another OS from within your current OS.
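If you would rather script the VM setup than click through the VirtualBox UI, something along these lines works. This is a rough sketch: it assumes VBoxManage is on your PATH, and the VM name, memory size, disk size, and ISO path are placeholders you would change (available options can also vary a bit between VirtualBox versions):

    import subprocess

    VM = "Win10Preview"
    ISO = r"C:\Downloads\Windows10_TechnicalPreview_x64.iso"  # placeholder path to the ISO
    DISK = r"C:\VMs\Win10Preview.vdi"                         # placeholder path for the virtual disk

    def vbox(*args):
        # Thin wrapper so every VBoxManage call is checked for errors.
        subprocess.run(["VBoxManage", *args], check=True)

    vbox("createvm", "--name", VM, "--register")
    vbox("modifyvm", VM, "--memory", "4096", "--cpus", "2")
    vbox("createhd", "--filename", DISK, "--size", "40000")   # size in MB, so roughly 40GB
    vbox("storagectl", VM, "--name", "SATA", "--add", "sata")
    vbox("storageattach", VM, "--storagectl", "SATA", "--port", "0",
         "--device", "0", "--type", "hdd", "--medium", DISK)
    vbox("storageattach", VM, "--storagectl", "SATA", "--port", "1",
         "--device", "0", "--type", "dvddrive", "--medium", ISO)
    vbox("startvm", VM)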

So far I can say it looks promising. The desktop is back to being the center of the OS, and the start menu is back completely. It looks like Windows 10 is a nice shift back to focusing on the things you need in a desktop OS, instead of trying to change your desktop into a big tablet.

Microsoft is famous for following a pattern of "great operating system" followed by a flop, followed by another "great operating system". Windows 8 is to Windows 7 what Vista was to Windows XP. Hopefully those that purchased Windows 8 will get a free or very low-cost upgrade.

Tom C

Performance Benchmarking Assassin’s Creed Unity


Assassin's Creed Unity (ACU) is a new release from Ubisoft that has not had a very successful launch due to serious technical issues that made the game unplayable for most, and although several patches have been delivered, many players are still complaining about performance and frame rate. The main complaints seem to be around significant pauses (hang-ups) and/or low frame rate (frames per second). Those pondering whether to buy the game and wanting to understand what they can expect on their hardware seek help on the Steam forums only to be told "you just have to try it and see" or "it doesn't run well on anything". Those who have bought the game, want to understand why performance is lacking, and would like to know which component to upgrade are being told "buy a new computer". In order to understand which compute resources the game is most dependent upon, I ran some benchmarking tests and diagnostics, and the results even surprised me.

In a nutshell, my findings are:

  • The game obviously requires a powerful GPU
  • The game does not require more than 8GB RAM, and doesn’t seem to use more than 6-8GB even if there is a huge surplus in the system.
  • CPU horsepower requirement is not terribly high, and there is not a huge correlation between frame rate and CPU compute power.

For this testing, I used a system that is well tested and known to be stable with the following hardware:

  • Motherboard: ASUS P8Z77-V LK
  • CPU: Intel Core i7-3770K running at 3.5GHz (single processor with 4 cores and 2 threads per core)
  • RAM: 24GB Kingston DDR3
  • GPU: MSI NVidia GTX 760
  • Disk System: OS is on an SSD; game binaries are on a hardware RAID comprised of three (3) SATA 6Gb/s drives.
  • OS: Windows 7 Professional 64-bit
  • Video Driver: NVidia version 347.09, driver only (GeForce Experience not installed)
  • Displays: Three 1080p LCD displays, not configured as a single display with NVidia Surround.

To perform the tests, I ran the game in windowed mode at 1920×1080 on one monitor, and on another monitor I ran Windows Performance Monitor (part of Windows 7) to track CPU utilization, memory utilization, CPU frequency %, page faults, and processor queue length. On a third monitor I ran the FurMark OpenGL stress test (free from http://www.geeks3d.com), but used its "CPU burner" to drive CPU utilization up and starve the game of CPU bandwidth. Lastly, I used FRAPS to quantify frame rate.
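If you want to reproduce the counter logging without Performance Monitor, a small script like this will do. It uses the psutil library, which is not what I used, but it records the same CPU and memory counters to a CSV you can graph afterward:

    import csv
    import time

    import psutil  # third-party: pip install psutil

    with open("acu_counters.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "cpu_percent", "used_ram_gb"])
        start = time.time()
        for _ in range(600):                        # about 10 minutes at one sample per second
            cpu = psutil.cpu_percent(interval=1.0)  # averaged over the one-second interval
            ram = psutil.virtual_memory().used / 2**30
            writer.writerow([round(time.time() - start, 1), cpu, round(ram, 2)])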

I ran the game at the "Low" graphics quality setting versus "Ultra", and used CPU Burner to see what the effect would be of having less CPU power available to the game.
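For reference, the "CPU burner" concept is nothing magic. A bare-bones stand-in (my own sketch, not FurMark's implementation) just pins a number of worker processes at 100% so the game has to compete for CPU time:

    import multiprocessing as mp

    def spin():
        # Busy-loop forever; these daemon processes die when the script exits.
        x = 0
        while True:
            x = (x + 1) % 1000003

    if __name__ == "__main__":
        workers = [mp.Process(target=spin, daemon=True) for _ in range(8)]
        for w in workers:
            w.start()
        input("Burning 8 cores -- press Enter to stop.\n")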

Running the game in “low” graphics settings, the frame rate was consistently in the 40-50 fps range, sometimes dipping to 25fps. CPU utilization was about 40% on average. Memory usage was about 4GB and did not fluctuate. There were no “hang ups” where the game pauses for a second or so.

Running CPU Burner with 8 threads did not affect frame rate significantly and did not cause the game to pause or stutter.  However, overall CPU utilization was 100% because both the game and CPU Burner were competing for CPU.

Running the game at the "Ultra" graphics setting, the frame rate was consistently around 30fps, dipping at times to 20-25. CPU utilization was significantly lower, approximately 20% on average. This makes sense because the GPU becomes the bottleneck at the higher settings, so the CPU spends more time waiting and prepares fewer frames per second. With CPU Burner running, I did experience a few pauses.

The "Ultra" graphics quality preset is defined to have all graphics options set to their highest values (environment, texture, shadows, etc.) except anti-aliasing, which is not maxed and is set to MSAA 2x. The final test performed was to also increase anti-aliasing to its highest setting (TXAA).

Running the game with all settings maxed, and with CPU Burner off, the frame rate was 25-30 fps on average, dipping below 20fps. There were no pauses or hang ups, but the game was very choppy due to the low frame rate. Turning on CPU Burner didn’t bring frame rate down significantly, but the pausing was frequent and pronounced.

The results of these tests show that increasing memory above 8GB will not benefit performance of the game, and that performance is mostly determined by the GPU and CPU. A system with a subpar amount of CPU bandwidth can run the game fairly well if the GPU is hefty (by running with graphics settings maxed, which shifts work onto the GPU and relieves the CPU of some of it). A system with a hefty CPU but a lower-end GPU can run the game fairly well by turning off anti-aliasing. But a system with a subpar CPU and a GPU at the lower end of the gaming spectrum doesn't really have a chance of running without the frame rate dropping into the teens and pausing.

Lastly, I will offer some opinion on how playable the game is on non-ideal hardware. This game is perfectly playable with a frame rate averaging in the high 20s. It is not a flight or racecar simulator, and the highest speed is jogging pace; most of the time the player is standing still. Those who adhere to the rules of thumb that "fps must be above 60 or it sucks" and "graphics settings must be maxed or it sucks" don't stand a chance of being happy.

If you want a higher frame rate, run lower graphics settings. The visual difference is not that significant. But if you want the best visuals and can’t tolerate 30fps, you had better be running a GTX 900 series GPU.

Running two GPUs in SLI mode would be a good idea, except there are a lot of reports that patch 4 broke SLI. So don’t run out and buy a second video card in hopes of improving the experience with this game.

Happy Gaming!

Tom C

Tom C Forecast: Cloud Gaming Will Transform the Gaming Industry


The cloud is the next big thing, and NVidia is looking to capture market share with a cloud offering specifically for gaming. It is a good strategy, because there is a big need for this type of cloud service. Video gaming is a $20 billion industry, and the video card is a key hardware component in gaming. On average you need a GPU that retails for $300-$500 to get a decent gaming experience, and it needs to be refreshed about every 2-3 years to keep up with gaming technology.

The average modern video card has a GPU with about 1,000 cores. To refresh memories, CPUs have 4-8 cores. Yes, for this kind of massively parallel work the GPU is hundreds of times faster than your Intel Core i7 CPU. It takes that much compute power to render a 3D world at 30 frames per second, with each frame containing hundreds of polygon-mesh 3D objects, each made up of hundreds of thousands of vertices. Your GPU has to compute all those vertices in 3D space relative to the player's viewpoint, fill in the faces with textures, compute and apply lighting and shading, etc.

That is why a computer with a $75 video card can play streaming video without issues but choke on a 3D game: the GPU has to calculate each frame in real time. This is where the GPU cloud comes in, because if the 3D rendering can be offloaded to a cloud service, then everyone can enjoy games and apps that currently demand $500 video cards while owning only a $75 video card.

The typical 3D game tracks the player's position/viewpoint and the millions of vertices that make up the polygons defining the world and the current scene. The orientation of all those vertices from the perspective of the player's viewpoint is calculated by the GPU, and then the faces of the polygons are filled in with textures. All this is done by your high-end GPU today… but in the future it would be done by the gaming cloud instead.
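As a toy illustration of that per-vertex work (using NumPy, with made-up coordinates and a deliberately trivial "camera" transform), this is the kind of math a GPU repeats for millions of vertices, 30 or more times every second:

    import numpy as np

    # Two vertices in homogeneous coordinates (x, y, z, 1).
    vertices = np.array([[1.0, 2.0, 5.0, 1.0],
                         [0.5, 1.0, 7.0, 1.0]])

    # A trivial "view" transform: shift the world 3 units along the z axis.
    view = np.array([[1.0, 0.0, 0.0,  0.0],
                     [0.0, 1.0, 0.0,  0.0],
                     [0.0, 0.0, 1.0, -3.0],
                     [0.0, 0.0, 0.0,  1.0]])

    view_space = vertices @ view.T
    print(view_space[:, :3])   # positions relative to the player's viewpoint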

To achieve this, games would be written to be cloud-friendly, and a module would run in the cloud that knows the game's visual make-up: the objects in the world, the polygons/vertices that make up each of them, and the textures needed for the polygon faces. All of this is stored on your PC today, taking gigabytes upon gigabytes of space, because the graphics rendering is done by your PC. All of it would be offloaded to the cloud, and the local game engine would simply transmit your position, and the positions of other objects, up to the game cloud; the rendered video would come back just like streamed digital content.
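A hypothetical client loop for that model might look like the sketch below. The endpoint, message format, and framing are all invented for illustration; the point is only that the PC sends lightweight state upstream and receives encoded video back:

    import json
    import socket

    def fetch_frame(host: str = "gpu-cloud.example.com", port: int = 9000) -> bytes:
        # Send the current player state up to the rendering cloud and read back
        # one compressed frame, which a local decoder would then display.
        state = {"player_pos": [12.0, 0.0, -4.5], "view_yaw": 90.0}
        with socket.create_connection((host, port)) as conn:
            conn.sendall(json.dumps(state).encode() + b"\n")
            return conn.recv(1 << 20)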

There is an additional layer of complexity in 3D rendering: light sources. I left that out above for simplicity. But the rendering of a scene is also affected by the objects that emit light, such as sunlight in the scene, headlights on cars, etc. The gaming cloud would also need to know the lighting objects in the world/scene in order to apply lighting and shadowing just as your local GPU does today.

This is all very doable, and in the works. NVidia is working on it, and Amazon has some GPU specific cloud offerings right now.

This is exciting and will have a huge impact on the gaming industry, because we will see the graphics capabilities in games take huge leaps forward. The reason is simple: game developers have to develop for a hardware platform that the majority of the user base can afford in their homes at the time. About every 5 years we see a title that ups the ante and offers the next generation of visual complexity/detail; but when that happens, the studio is bombarded with complaints and criticism because a good segment of the user base doesn't have the hardware required to run the game. That is very costly and damaging to the studio's industry reputation, so it makes sense that studios try to avoid huge leaps forward in order to steer clear of that pain.

It is hard to predict what the business model will be with gaming in the cloud. I would expect the model to be some type of monthly charge based on use time and GPU compute power. Will that charge go to the end user, or will the studios eat that cost and roll it into the price of the title?

One barrier to transitioning to cloud gaming is that people who already have high-end GPUs will not want to pay for a cloud service they don't need, having already made the investment in high-end GPU hardware. Because of this, I anticipate games will be released that offer both local and cloud GPU support, which will allow consumers who have the hardware to get their ROI, while allowing those who do not have the GPU hardware to run the game without a hardware investment.

Console systems (PlayStation, Xbox, Wii, etc.) will come down drastically in price, and possibly become obsolete. The graphics hardware is the biggest driver of the cost of the system. Once graphics are offloaded to the cloud, the console becomes a thin-client streaming device.

Gaming on mobile devices (Android/iOS) will be much improved. Mobile devices have serious constraints that create a barrier to good gaming:

  • Tight power consumption constraints, because a tablet that only lasts 2 hours on battery won't sell, which means advanced graphics compute power isn't possible.
  • Tight hardware cost constraints, because the price point for the average tablet is no longer $600, which also means advanced graphics capabilities aren't possible.
  • Tight storage constraints, because the average mobile device has 16GB of storage space, which means games cannot have 20GB of textures stored on the device.

Cloud gaming will alleviate all these constraints and you will be able to play the same titles you can play on a PC or console.

Lastly, the losses in revenue due to piracy will be alleviated, and DRM, with all its headaches, will no longer be necessary. People will not be able to play games they did not purchase, because most of what the game does is offloaded to the cloud. People who don't pirate games won't have to pay a slightly higher price to offset the losses caused by those who do.

Happy Gaming!

Tom C

PC Gaming Master Class: Tips for Running Assassin’s Creed Unity

The release of "Assassin's Creed Unity" (ACU) on the PC platform has been met with a frustrated user base that finds the game's performance unacceptable. This is very common when a release marks the transition to the next generation in 3D gaming. Having over 20 years' experience with PC gaming, hardware, and tweaks & hacks, I'll offer my advice.

The fundamental problem is that ACU isn't built on 5-10-year-old 3D engine technology, so people who try to run it on hardware that is five years old find it doesn't run very well, and running the game with the highest graphics settings today requires high-end hardware.

First and foremost, set realistic expectations. This game is visually stunning, and the level of detail and number of objects on the screen are incredibly high. Every one of those objects, from the NPCs (non-player characters) to inanimate objects to the leaves on the trees, is made up of hundreds of thousands of polygons, which your graphics card has to compute based on the viewpoint, render with textures, and then apply lighting, shadowing, etc. Your CPU has to feed your GPU the data it needs to do its job, and also handles things like the movement of the NPCs, the leaves blowing in the wind, your character's movement, etc. And all of this takes RAM as well, both on your graphics card and on your motherboard.

Each scene in ACU has 50-100 NPCs on screen at any given time, and each of those takes CPU and GPU cycles. Contrast that with, say, Grand Theft Auto IV, which has 5-10 NPCs in view at any given time.

GTA4 is a good example to make a few points. When GTA4 was released circa 2008, it met with similar performance issues, because people with the best systems of that era found for the first time that they couldn't play at max graphics settings. It took a few patches to wring out the issues and many years before the technology of the average PC user caught up to the point that the game was no longer such a GPU/CPU hog. The CPU/GPU/memory required to run GTA4 well in 2008 would have cost $8,000; within a few years that same compute power was found in $799 department-store PCs.

So as frustrated as we are that we can't set everything to max quality, we have to set the graphics quality to run on what we have today. If it is any consolation, the people who don't abandon this game now out of frustration can look forward to it being relevant for the next 10 years. That is about how long it will take before new games start making ACU look like yesterday's 3D gaming technology.

All that being said, let me make some suggestions….

1) Stability & Heat/Power – Make sure your PC is stable to begin with. The fact that you haven't had issues doesn't mean much, because you haven't pushed your CPU/GPU/motherboard this hard — ever. You may have heat dissipation issues or an underpowered power supply that has never materialized into problems because nothing has pushed the system hard enough.

Download the FurMark burn-in/benchmark test and Open Hardware Monitor. Use FurMark to push your GPU & CPU to max for as long as you typically play (hours), and use Open Hardware Monitor to make sure that your CPU/motherboard/GPU temperatures remain stable and the various voltages remain stable. If you see temps just keep rising, you have a heat dissipation problem; you will need to look at a better heat sink & fan for the CPU and a better-ventilated case with plenty of additional fans. If you see voltages are unstable, you have a power supply issue. Either of these issues is going to make your PC unstable.
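If you want something more objective than eyeballing the graphs, a tiny helper like this tells you whether temperatures are still climbing at the end of a run. The sample values, window, and tolerance are arbitrary examples; paste in whatever your monitoring tool logged:

    def still_climbing(temps_c, window=5, tolerance=1.0):
        # True if the last `window` samples are still trending upward by more than
        # `tolerance` degrees, i.e. the system has not reached a plateau.
        recent = temps_c[-window:]
        return (recent[-1] - recent[0]) > tolerance

    gpu_temps = [62, 68, 72, 75, 77, 78, 79, 79, 80, 80, 80, 80]  # example samples from a burn-in
    print(still_climbing(gpu_temps))  # False: temps plateaued, so cooling is keeping up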

2) Video Drivers — It is fairly obvious that NVidia is in a partnership with Ubisoft (and everyone else), and you can bet the development and testing targeted NVidia first and everyone else second. If you have an NVidia card, you are probably going to be better off with this game (and many others). The game is developed against DirectX 11 and should work with any vendor's card that conforms to that spec, but in reality we often find that vendors interpret standards differently, etc.

That being said, even if you are running an NVidia card, you might have some issues. First, you need to be on the latest driver (344.75 IIRC). I am surprised at the number of people on Ubisoft's forums who were running old versions; when it was suggested that they update the driver, they reported it helped a lot. Updating your drivers should be something you consider doing any time a new driver is out — and especially when you are running a new release and having issues.

That being said, I have found some stability issues with NVidia's software stack. I was working on a system that had lockups in 3D games, and found that removing the drivers and re-installing them without GeForce Experience resolved the issues. While troubleshooting, I found hits on the net indicating a certain process was known to have memory leaks, and when I looked at that process it was indeed growing in memory consumption. I don't recall what the process was named specifically, but I think it has to do with NVidia streaming and is part of the GeForce Experience software, so I don't run it. Yes, I know if I had an NVidia Shield I'd be in a pickle. Run the driver only and see if stability improves, and if you need functionality that GeForce Experience provides, then contact NVidia and put the onus on them to resolve why stability decreased when you installed it.

Also, in late October and into early November NVidia released driver updates like every few days. I would venture a guess that was related to this game and issues identified in testing that required driver fixes. So I doubt anything less than 344.75 can be expected to work.

NOTE – It looks like NVidia released 347.09 on 12/23, which I was not aware of; I will test it next.

Some people will find that other, older games may not work well with anything newer than a legacy driver version. There isn't much you can do about that; the avenue for resolving it is with the maker of the game that doesn't work on new drivers.

I am sure that those running other video card makes will also benefit from using the latest drivers, and probably also should install the driver only and not the additional crap the vendor includes.

3) Video Card Hardware – I run ACU on a GTX760-based system and a GTX750-based system. It is doubtful that a good gaming experience can happen with a GTX 6xx-series video card — even though I am sure the system requirements say it will work.

4) Memory — you need a lot of it and it needs to be top notch. Good, experienced system builders know that stability is often impacted by memory chips that should work fine in the motherboard but just don't. Mixing and matching brands/types should be avoided. If your system came with 8GB of generic memory and you want to upgrade it to 16GB, for best results you should avoid keeping the existing 8GB and adding another 8GB of a different brand, type, or frequency. For best results, run memory that is specifically listed by your system or motherboard maker as compatible. This memory will typically cost more, but that is the price you pay for the best stability and performance. You can try running other memory or combinations, and you may or may not find success.

5) BIOS Firmware – If a newer BIOS firmware is available than the one you are running, you should upgrade. Firmware releases typically address stability issues discovered with certain memory brands/types. If you are running memory not listed as specifically compatible, a firmware update might just make your memory work.

6) CPU — Using an Intel CPU typically results in the best compatibility. AMD makes a fine CPU, but it is not unheard of for issues to be related to the CPU. You can count on the majority of development & testing being done on Intel CPUs, so while using AMD might save a few bucks, you pay by possibly being impacted by issues.

That being said, you need a powerful CPU. Yes, the graphics card bears the brunt of the abuse in a 3D game, but the CPU needs to track & manage all the objects and such.

7) System Overclocking – It should go without saying that overclocking is done at one’s own risk. If you are overclocking, and even if you have never had any issues before, go back to default settings and make sure overclocking isn’t your culprit.

8) Graphics Settings — Set them to low and see if your gaming experience and stability improve. Yes, we all want to run on high, but be realistic. You can use FRAPS to see what your frame rate is. Use the advanced graphics settings to change one thing at a time, and examine how each change improves the visuals and what it does to the frame rate. Pick the combination that gives the best look but keeps FPS at 30. I am currently running with everything set fairly high but anti-aliasing set low. For me, anti-aliasing is what kills frame rate — and not just in this game.

Best of luck, happy gaming, and patience!

Tom C

Cloud!

It is late in the year 2014, and for the past several years the various vendors have been preparing their cloud solutions. Amazon led the charge and pretty much set the industry standard with EC2; now all the vendors are following suit, and CIOs and CTOs have to figure out how best to integrate cloud into their IT strategy. Those that have built careers by building out huge data centers, strategically placed across the globe with thousands of purchased servers, are finding it difficult to transform.

The idea of cloud is simple: it's infrastructure as a service (IaaS). You need a server, you go up to the cloud and provision one, and a few minutes later you receive the details and credentials to connect; soon you are logged in and installing the packages needed to support the application stack. You need ten servers? A hundred? No problem! The cloud provides all you need, along with the supporting services: virtual networks, storage, directory services, etc.
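To give a feel for how little friction there is, here is a minimal sketch using Amazon's boto3 SDK against EC2. The AMI ID, instance type, and key pair name are placeholders, and a real shop would layer its own tagging, networking, and approvals on top:

    import boto3  # third-party: pip install boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    response = ec2.run_instances(
        ImageId="ami-12345678",     # placeholder AMI
        InstanceType="t2.micro",
        KeyName="my-keypair",       # placeholder key pair
        MinCount=1,
        MaxCount=1,
    )
    instance_id = response["Instances"][0]["InstanceId"]
    print(f"Provisioned {instance_id}; connection details follow a few minutes later.")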

Gone are the days of old, where you had to wait months for hardware procurement, followed by weeks for the various infrastructure silos to rack/stack/blast/config your server and hand it off to your AppDev team. Let’s not forget about the weeks that get tagged onto that provisioning cycle for the various approvals and sign-offs.

No, we don’t do that anymore.  Now you can chop months off the time-to-market for your great new business enabling technology idea… or at least that should be the case.  The challenge is the infrastructure camps have resisted embracing the cloud because that would require they reinvent pretty much all that they do; and they have spent 20+ years learning how to be a server provisioning/deployment shop.

There are valid reasons to pause and do a lot of research and deep planning on your cloud strategy. In order to successfully integrate the cloud into your IT business model, you will have a lot of issues to resolve in the regulatory/compliance/legal/HR/risk/IT-security areas. All of those corporate stakeholders know that their best interests are served by not letting data reside on third-party systems — which means a public cloud service is out of the question. Or perhaps you can come out of those deep discussions with agreement that data can be assigned to categories, and certain data can reside in a cloud but some can't — which is good, but you have now created another thing that needs to be policed and audited. Or you might be able to sell the idea of an encrypted VPN to the cloud, plus the cloud vendor agreeing to ring-fence your cloud components completely — which would also need to be policed and audited.

Yes, the cloud solves a lot of problems, but creates some risks and requires a lot of stuff to be thought about and ironed out. The time to do that, obviously, is before putting out the capital expense. You can’t really even decide what kind of cloud you will have until these discussions have happened. It is either going to be 100% in the public cloud (not likely for most companies), or 0% in a public cloud and 100% in an internally grown cloud, or some combination of the two.

The problem with the latter two scenarios, where some form of internal, home-grown cloud is in the strategy, is that the internal cloud needs to be designed, built, deployed, and operated by the same infrastructure folks who have spent the past two decades convincing themselves that a four-month infrastructure procurement and deployment cycle is perfectly fine… and the same infrastructure organization that, at this very moment, probably can't provide transparency into the exact makeup of your allocation charges.

I don’t mean to sound critical of infrastructure organizations, but the scenario I just described exists all over the place in the largest corporations across the globe.   As a result, senior managers in infrastructure now have to figure out how to build a cloud that is as mature and well-thought out as Amazon’s EC2 whilst starting a decade behind.

You are competing with Amazon's mature, battle-tested cloud solution because your AppDev teams have been playing with it at home, or came from jobs where they could leverage Amazon's public cloud. They know that procuring a Windows or Linux box should take about 8 minutes. They know they can expect to be charged 17 cents an hour, and only while the server is powered up. They know that they need to design their apps to scale out, with the app knowing how to create more servers in the cloud to accommodate workload peaks and scale back down to reduce cost.
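That scale-out/scale-down behaviour boils down to a sizing rule like the toy sketch below. The per-instance throughput and minimum are made-up numbers, and the actual provision/terminate calls would go through whatever API your cloud exposes:

    def instances_needed(queue_depth: int, per_instance: int = 100, minimum: int = 1) -> int:
        # Ceiling division: enough instances to drain the queue, never below the minimum.
        return max(minimum, -(-queue_depth // per_instance))

    for depth in (40, 250, 900, 30):
        print(depth, "queued jobs ->", instances_needed(depth), "instances")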

All of that means your cloud needs to be able to do all of that: automated provisioning and decommissioning, fully baked cost allocation models, an API, etc. You have to be able to scale the capacity of the underlying hardware infrastructure accordingly and make that transparent to the app layer — because a cloud that is "all filled up" and can't accommodate new provisioning/growth requests for four months while more SAN shelves and/or hypervisor hosts are procured would take the business right back to the days of yore.

 

Tom C