Linux as a Desktop OS in 2019

I doubt there are many people who haven't heard of Linux in 2019. For those who haven't, Linux is an operating system created in the early 1990s by a University of Helsinki student named Linus Torvalds. He created a Unix-like operating system as a curiosity project while studying operating systems, and let's just say it was well received. Today, in Fortune 500 companies, the number of Linux servers exceeds the number of Windows servers, and Sun servers have pretty much become extinct.

In the desktop space, Linux was not a highly viable option until roughly the past ten years. Today, it is quite viable for both desktops and laptops. Hardware compatibility is very wide, and the vast majority of applications are built for Linux in addition to Windows and Mac. However, even in 2019, there are applications that only work in a Microsoft Windows environment. You can guess that Microsoft Office is one such application set. I will talk more about that later.

Before we jump into the things you would gain by running Linux as your desktop or laptop OS, let me just say this blog is not intended to be a "how to" guide and instead is more of a "why to" guide.

Why Linux on the Desktop — What Do I Gain?

To put it simply, you gain freedom, security, personal privacy, performance, stability, supportability, lifecycle management, and cost savings. These things are simply inarguable, but there is one more which could be argued: you gain coolness.

Freedom

By transitioning to Linux as your desktop OS, you are no longer locked into a proprietary OS from a vendor that decides what features and technology stacks you need. You run what you need and want to run. If you want to store documents in the cloud, you can decide to do that, but you won't have OneDrive forced upon you, taking up system resources when you don't even want it. The same goes for Skype, OneNote, and Cortana.

Personal Privacy

Speaking of Cortana, you no longer have every word you type spied upon and transmitted to a vendor so they can tailor ads and services to you... while you hope and pray the vendor never gets breached, or simply decides to violate its own privacy policy, or decides to use its legal army to blatantly violate your privacy and get away with it. Yes, you can turn off these scary things in Windows 10... but what if the "off switch" is broken?

Security

You no longer have to wait for that vendor to decide to push patches to you on a monthly or quarterly basis. To be fair, Microsoft does release emergency patches for the most critical threats.

You also get an OS that doesn't need a virus scanner running 24/7 (usually). The vast majority of malware targets Windows systems.

In terms of security, you get an OS that was designed and built from day one to be secure. Linux is locked down in its default state and Windows is not; Windows gives users more power than they should have, which makes it easy to inadvertently do dangerous things. Here are some more reasons.

Lastly, security is increased and threat risks lowered simply because Linux distributions tend to run only the services and software the end user specifically asks for. As an example, there are no file sharing services running on a Linux box by default unless they have been deemed necessary and installed. Windows 10, on the other hand, installs with file sharing enabled and its services running. This means that whether or not the user intends to share files, the services are installed and running, presenting a risk of a bad actor or malware on the network penetrating them. The fewer services you run, the smaller your attack surface and the lower your security risk. An attacker cannot attack a service that isn't installed or running.
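If you want to see that difference for yourself, it only takes a minute to audit what a Linux box is actually exposing. A minimal sketch, assuming a systemd-based distribution such as Ubuntu (the tools shown ship by default on most desktop distributions):

    # List every service currently running
    systemctl list-units --type=service --state=running

    # List every TCP port something is listening on, and the process that owns it
    sudo ss -tlnp

On a fresh desktop install the second command typically returns a very short list, which is exactly the point: fewer listeners, smaller attack surface.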

Stability

You no longer have patches pushed to you which have escaped wide testing and which might make your system unbootable or unstable. One recent emergency patch made some Lenovo laptops unbootable. Another emergency patch, meant to mitigate the "Meltdown" threat, made some AMD systems unbootable. Then there was the time a Microsoft patch put systems with Intel SSDs into a crashing boot loop.

Despite a track record that exposes serious problems in its patch release management, Microsoft still feels it is best to manage patches for the user. What this means is that you get patches when Microsoft wants, and your system can potentially be broken at a time when you can't afford downtime.

That isn't to say there has never been a patch for a Linux distribution that induced problems, but it is pretty rare. At least you get to choose when to take the risk, and you can roll back when something goes wrong. Which reminds me to mention that in most cases you cannot roll back a bad Microsoft update.
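As a small illustration of that control, on Debian/Ubuntu-family systems you can freeze any package you don't want touched until you are ready, then release it later. A minimal sketch, assuming the standard apt tooling (the kernel package named here is just an example):

    # Keep the current kernel image from being upgraded for now
    sudo apt-mark hold linux-image-generic

    # Later, when you are ready to take the update
    sudo apt-mark unhold linux-image-generic
    sudo apt update && sudo apt upgrade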

Microsoft also takes control of which drivers are used for the devices in your system, and in many cases the drivers deployed to end users via Windows Update are a couple of versions behind the most current. What this means is you are not benefiting from the device manufacturer's development efforts. I was working an issue on one Windows 10 laptop which had very unstable wireless and would lose all ability to connect until the device was power cycled. I was able to resolve this by installing the most recent driver from the device manufacturer, which would fix the issue until Microsoft sent updates and effectively rolled the driver back to the problematic one.

Performance

The Linux kernel is far more efficient at managing hardware resources and multitasking/threading, and the end result is things just run faster. There are plenty of examples where people have done performance testing and benchmarks to compare Windows and Linux.

Aside from the kernel simply being better performing, the entire end-user experience sees a performance benefit because of less bloat in the operating system and installed packages.

This means you get more bang for your buck in terms of return on your investment in PC hardware.

Supportability

I have to admit that I have a really negative view of Microsoft in the area of support, and I am sure what I write here will show that bias. Over the years I have seen countless cases where someone experiencing a valid issue goes to a Microsoft community board and posts it in a clear, concise manner that demonstrates they have spent appropriate time troubleshooting before seeking support, and the only thing that comes back from a Microsoft representative is something along the following lines:

“My name is [NAME HERE] and I am sorry to hear you are experiencing problems with [PROBLEM SUMMARY HERE]. I am happy to say I can assist you with this problem. Please do the following:

  • In the search box on the taskbar, enter Command Prompt. Press and hold (or right-click) Command Prompt (Desktop app) from the search results and select Run as administrator.
  • Enter DISM.exe /Online /Cleanup-image /Restorehealth (note the space before each "/"). (Note: This step may take a few minutes to start and up to 30 minutes to run and complete.)
  • Enter sfc /scannow (note the space between "sfc" and "/")."

The Microsoft representative almost always chimes in as soon as the thread is started, with some variation of the above. In 99% of cases, the poor user goes off to do these things, comes back, and says "Hi, I did that and the problem continues... what now?"

What happens next is crickets: the user waits for the Microsoft representative to advise next steps, which in most cases never come; and when they do come, it is almost always "go into System Restore in the control panel and choose the option to Reset This PC".

That is pretty much the Microsoft support troubleshooting script: dism /restoreHealth, then sfc /scannow... then reset. For those not aware, "Reset This PC" is a drastic step: while it does preserve the user's documents, it is a long process and, in many cases, it still doesn't fix the problem, because the problem is within Windows itself and appears even on a fresh install.

How most problems actually get fixed in Microsoft land is that eventually a Microsoft Certified Professional, who spends his or her own time helping the community, comes along and posts the real resolution or workaround.

Over the years I have seen this so many times that it really makes me angry. There is nowhere you can go to see what known bugs are out there, search them, and track whether someone at Microsoft is aware, what the disposition is, and, if a bug is verified and will be patched, when that will happen.

Not so with Windows; but with the major distributions of Linux, the public does get access to the issue tracking system and can report issues, search them, and so on. Here is the tracker for Ubuntu.

What Linux users do when they face an issue which doesn't appear to be within their own control (e.g. their own mistake or misconfiguration) is go to the appropriate issue tracker and see if their problem is already reported, and if so, how it was resolved, how it can be worked around, and so on.

To boil it down to the simplest elevator pitch: the free support you get from the community surrounding your completely cost-free Linux distribution is miles better than the support you get from the vendor you paid $139 for an operating system that is supposed to be stable and empower your work rather than impede it.

Cost

It seems plainly obvious that an operating system that costs $0 is a savings versus an operating system that costs $139 to license. However, the savings are actually much greater.

Because running Linux almost always goes hand in hand with moving away from proprietary software and embracing open source alternatives, there are also the savings from not needing a Microsoft Office license, or Photoshop (GIMP > Photoshop), or 3D modelling software (Blender rules).

In addition to the savings on the OS and software, you also have the gains associated with the increased performance, the reduced downtime, and so on. Many would agree that you can't put a price tag on having an operating system that doesn't spy on you, isn't a breeding ground for malware, and isn't a ticking time bomb waiting for an attacker to gain access to your system by exploiting Windows vulnerabilities.

Lifecycle

The leading distributions of Linux have a support timeframe that is much longer than Windows. This means you can run them for many more years while the OS and all its software packages continue to be maintained, bugs fixed, and security holes closed.

The useful lifespan is increased because of the support cycle, and also because Linux is higher performing, less bloated, and therefore less demanding of hardware. To put it simply, as the years go by, more things you probably won't use are crammed into the Windows OS, and those things consume disk, memory, and CPU cycles. Linux distributions are intentionally managed so as not to cause this kind of bloat.

There Have To Be Downsides – What Are They?

Complexity and User Learning Curve

Obviously, many of the benefits mentioned also carry downsides if examined from the opposite direction. For example, Linux is less prone to virus/malware spread partly because it is not so easy for the user to casually run a script or executable; the flip side is that it is, well, harder to use overall. While that is true, it isn't that much harder to do the things you want and need to do.

Some would disagree, but I can't see anyone managing a Linux system without ever dropping to the command line to get things done. Yes, there are GUI apps that expose many of the command line features, but I still wouldn't plan on running Linux without also planning to learn how to use the command line for basic tasks.

Also, earlier we talked about how Linux doesn't have all software installed and running by default, and we said that was a good thing from a security perspective. But what do you do if you really do want to run an Apache web server?

The answer, assuming a Debian-derived distribution (Ubuntu, etc.), is that you install the Apache httpd package, and if you are using the built-in firewall, you configure it to allow HTTP traffic. Here is a guide: https://www.digitalocean.com/community/tutorials/how-to-install-the-apache-web-server-on-ubuntu-18-04-quickstart

If one plans to run Linux but doesn't plan on following instructions like these and carrying out commands, that is not a good plan. That said, most distributions don't enable UFW (the firewall) by default, so you can skip that part of the guide, and it really boils down to one command line: "sudo apt install apache2".

When that command is executed, the system installs any dependent packages that are required and installs the Apache web server in a very basic configuration, which the user can then customize to work as they require.
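To make that concrete, here is roughly what the whole exercise looks like on Ubuntu. This is a sketch assuming the default repositories and systemd; package and service names can differ on non-Debian distributions:

    # Install the Apache web server plus any dependencies
    sudo apt install apache2

    # Confirm the service came up
    systemctl status apache2

    # Only needed if UFW is enabled: open the standard web ports
    sudo ufw allow "Apache Full"

    # Quick sanity check from the same machine (curl may need installing first)
    curl -I http://localhost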

Everything in Linux is modularised like this, and while it is a good thing from many perspectives, the bottom line is it has a slightly higher learning curve.

The good news is that everyone who is now a Linux expert, and whose guides you will read and follow, was once as inexperienced as you. When you suck it up and dive in, it won't be long before you are a much more well-versed computer user.

Hardware Incompatibility

If your system, and the devices in or connected to it, are less than 10 years old, you can expect the major Linux distributions to install, detect your hardware, and provide an "up and running" experience.

That said, it would be wise to inventory your hardware and specifically research and verify that there are Linux drivers for everything. It used to be that when new hardware hit the market, it wouldn't have a Linux driver or any support on day one, and all you could do was hope. It really isn't like that anymore.

I would be remiss if I didn't mention that some hardware has been "crippled" so that it can never run on Linux. It is very rare these days, but what used to happen is that a device would offload all or part of its firmware into the Windows driver, and the manufacturer would neither release a Linux driver nor open source the firmware.

What that means is that it becomes legally risky for anyone to even try to reverse engineer the driver to make the device work under Linux. This kind of practice used to be somewhat common, but these days it is not, as the community has pushed the major industry players to make "open source" part of their guiding principles. If you are unlucky enough to own a device where this has been done, you didn't want to run that device anyway.

You Require Proprietary Software

We will talk about this in great depth next, but if you think you cannot run Linux because you require software that isn’t available on the Linux platform, there are ways around this. So let’s just dive into that now.

What Can I Do If I Want to Run Linux But Must Have Windows Apps Like Microsoft Office?

It used to be that if you needed Microsoft Office, that was a stopper for adopting Linux as a desktop OS. But over the past couple of decades a few things have happened that make it less of a stopper: WINE, open source alternatives, and virtualization.

Windows Emulation – WINE

WINE is a compatibility layer for running native Windows applications on Linux; the name is actually a recursive acronym for "Wine Is Not an Emulator". A lot of applications work fully under WINE, some work partially, and some won't run at all. The reality is, you might get Microsoft Office 2007 to run under WINE.

Employ Open Source Alternatives

A better way of handling Microsoft Office under Linux is to avoid it altogether. There are several open source alternatives to Microsoft Office; LibreOffice is my favorite. Other options include FreeOffice, WPS Office, OpenOffice, etc. Any of these alternatives will provide the core functionality of Microsoft Word, Excel, and PowerPoint. Of course, you could also migrate away from Microsoft Office by using the Google Docs apps. That said, you won't find 100% compatibility. For example, if you are dependent upon VBA macros embedded in MS documents, that is going to be a problem, and it is a problem in itself that needs to be fixed.

It is important to point out that you do not need to run Linux as your operating system to benefit from the MS Office alternatives. Even if you don't plan on running Linux, it is still a good idea to decouple yourself from the Microsoft Office products. I subscribed to Microsoft Office 365 until a few years ago, then terminated my subscription because it wasn't giving me anything I couldn't get from open source alternatives, and it really did nothing but pollute the Windows event logs.

Virtualization

The last option for dealing with applications that create a stopper for adopting Linux is virtualization. Virtualization is a technology which allows one to create a “virtual” machine (VM) running within the operating system being used. To put it simply, if you are running Linux, you can create a Windows VM, and then within that VM you install and run the software. Windows 7 and 10 run well as virtual machines. Windows XP, 95, and 3.1 also can run as VMs, but why would you?

An interesting note is that Windows 7 Professional used virtualization technology to provide 100% backwards compatibility with legacy applications. Most legacy applications could run under Windows 7 by employing a few tricks, but some could not run in any way. Windows 7's XP Mode used Microsoft's Windows Virtual PC virtualization technology to provide this last-ditch compatibility mode. Programs installed in this mode took longer to launch, because of the time it takes to spin up the VM engine, and ran slower because of the overhead of running both the VM engine and the VM.

“Virtualization” Does Not Mean “Slow”

If you're thinking "virtualization was slow in Windows 7, so why would I ever want to use virtualization to tear down the stoppers to running Linux?", the answer is that in 2019, any modern CPU produced in the last five to eight years has virtualization features that drastically reduce the performance overhead. If your CPU is Intel, it should have VT-x. If you have an AMD CPU, it should have AMD-V.

Note that on older systems, the virtualization features might be available in the CPU but not enabled in the BIOS, in which case you need to go into the BIOS, find the setting, and enable it. If you are unsure, there are plenty of diagnostic apps you can use to verify that your system supports VT-x or AMD-V. Or, you can install the virtualization software and it will warn you if VT-x or AMD-V is not found.
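If you are already booted into a Linux live session, checking for those CPU flags is a one-liner. This assumes an x86 system; the flag is vmx for Intel VT-x and svm for AMD-V:

    # A non-zero count means the CPU advertises hardware virtualization
    grep -cE '(vmx|svm)' /proc/cpuinfo

    # Or, if the cpu-checker package is installed on Ubuntu
    kvm-ok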

You can still virtualize without these CPU features, but you cannot virtualize a 64-bit operating system and you can expect poor performance.

Virtualization to Run Native Windows Apps

Using virtualization to run native Windows apps is as simple as installing the virtualization software, creating a virtual machine, and installing the Windows OS and whatever applications you require.

Virtualization Software

For virtualization software, your choice is essentially VMware Workstation Player or Oracle's VirtualBox. VMware Workstation Player is commercial software, but you can get a no-cost license for your own personal use. Oracle's VirtualBox, on the other hand, is licensed under the GNU General Public License v2, which for your purposes just means it is free now and will remain so. The catch with VMware Workstation Player's free license is that it is more restrictive and could be terminated if VMware so decides, though it isn't likely they would.

Also, VMware’s license could cause you some legal problems if the work you do on that PC is considered “commercial”. From a licensing and legality perspective, your safer choice is VirtualBox. It is important to state this is my opinion and advice, not legal advice.
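Getting VirtualBox installed is straightforward either way. On Windows you download the installer from virtualbox.org; on a Debian/Ubuntu-family system it is one command (a sketch assuming the distribution's own repositories, which may lag Oracle's latest release):

    # Install VirtualBox from the Ubuntu repositories
    sudo apt install virtualbox

    # Verify it is installed and see the version
    VBoxManage --version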

You Need a Windows License

Running Microsoft Windows as a virtual machine requires a license. In other words, you have to buy it. If you are replacing your current Windows OS with Linux and hope to use your current Windows license inside that VM, there are technical hoops to jump through, and it may not be 100% legal. In particular, it may not be legal if your Windows license is tied to the physical hardware, such as an OEM version of Windows that came pre-installed on that PC.

Another option is to use an evaluation copy of Windows. Microsoft has offered free evaluation versions of Windows for quite some time. The problem is that the free license typically lasts only 90 days and can be renewed a limited number of times. After the evaluation license expires, the copy of Windows becomes quite useless.

In a nutshell, right here is where it can get complicated — not technically but legally. This might be a good place for most to simply decide to cut ties altogether with that piece of Windows-only software.

But if you must continue down this path, please do your homework and make sure you have the Windows product key needed to install and activate Windows within the VM.

I’m Ready To Do It!

OK, so you are ready to go for it. Before you do, I'd like to tell you what to expect. Will you have a windowing GUI, a desktop, a taskbar? If so, what does it look like? I will talk about that shortly, but let me first make a suggestion.

Try it out first!

You can try Linux in a couple of ways. Most distributions offer an image you can put on a USB thumb drive and then boot your system from. Just be careful to choose the option to "Try It" rather than "Install". Don't worry: even if you click "Install", it takes several more steps, and a few "are you sure you want to do this?" prompts, before anything is written to disk, which would clue you in that you made an error.
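For the curious, writing the downloaded image to the thumb drive can be done with a graphical tool (Rufus on Windows, balenaEtcher on most platforms), or from an existing Linux or macOS shell with dd. A minimal sketch, assuming the ISO is named ubuntu-desktop.iso and the USB stick shows up as /dev/sdX (both are placeholders, and dd will happily overwrite the wrong disk if you point it at one):

    # Identify the USB stick first; get this wrong and you destroy data
    lsblk

    # Write the image to the whole device (not a partition)
    sudo dd if=ubuntu-desktop.iso of=/dev/sdX bs=4M status=progress conv=fsync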

Or, you could install the virtualization software (VMware or VirtualBox) under your current Windows, then use it to create a VM and install Linux in that VM. I recommend this approach, especially if you plan to transition to Linux as your primary OS and then virtualize Windows from within Linux. By trying it this way, you are essentially doing the mirror image of that end state, gaining experience with the virtualization software and validating that your hardware has the features needed to virtualize well.

Where Do I Get It, and What Is This “Distribution” Thing You’ve Mentioned?

Linux is not made or sold by any one company. The Linux kernel is the part that Torvalds created and continues to evolve and maintain with a team of volunteers around the globe, but on its own it won't do anything for you. You won't even get a command prompt, and you certainly won't end up with a useful system.

When we say "Linux" these days, we mean the Linux kernel plus all the "stuff" (software layers) needed to make good use of that kernel in a modern context. Distributions are the kernel plus the "stuff", and there isn't just one distribution because not everyone needs or wants the same "stuff".

Just to give some insight into what this "stuff" is, there is more than one desktop GUI available for Linux. GNOME 3 is the most popular desktop environment and is comparable to the modern Windows UI, except it is a heck of a lot more customizable. Being comparable to the modern Windows UI, GNOME 3 also takes a significant bite out of compute resources. Just like the Windows "Aero" themes, GNOME 3 has transparency, animations, and so on; and also like Aero, it wants a hefty CPU and GPU. If one is less interested in the glitz and glam, and the compute resources it eats up, one might like Xfce better than GNOME 3.

Xfce provides a full desktop but is designed to be lightweight: no transparent windows or bars, no animated windows, and so on. A system running Xfce doesn't need as much horsepower, and even if the system has the horsepower, perhaps the user doesn't want it eaten up by the desktop environment.

The point is, you run what you want to run, and only what you want to run, and that choice is usually driven by the hardware you have and how much horsepower you are willing to give to any particular function.

It isn’t just about the windowing system. All the common software packages can vary. That is really what these distributions are. Folks bundle the Linux kernel with the appropriate software packages in order to fulfill a common set of compute requirements.

A desktop distribution will give you a desktop UI, system utilities, an office suite (LibreOffice), a web browser (usually Firefox), a printing system, and so on.

A server distribution, on the other hand, usually has no UI, all the utilities are command line, there is no office software, and the print subsystem is absent. It does, however, include the Apache web server, PHP, a MySQL database server, and so on.

That is really what distributions are about: giving you varieties that best fit your computing needs so you are up and running quickly after installing. Do not misunderstand, though; you can start with a desktop distribution because you want a desktop UI and office software, and then add the server bits if you need them. That is what the flexibility of Linux is all about.

Since we are talking about the desktop context here, there are a handful of popular choices:

  • Ubuntu Desktop – The most popular and widely used distribution, with a specific "Desktop" blend. Probably the easiest to install and use. Uses the GNOME 3 desktop by default, but can be switched to Xfce for a lighter-weight desktop (see the example just after this list). There is an even lighter blend targeted at aged or low-powered hardware called Lubuntu... the L is for "light".
  • Linux Mint – Trending these days. Has its own desktop environment, a fork of GNOME. Mint was originally derived from Kubuntu, an Ubuntu blend that used the KDE desktop environment, which many preferred in the 2000s; but these days even Mint uses a GNOME derivative as its UI shell.
  • Elementary OS – Another Ubuntu derivative, aims to be a “lesser learning curve” OS.
  • Arch Linux – Targets the x86-64 architecture only, and aims to adhere to the KISS principle, minimalism, etc.
  • Tails – This trending desktop OS specifically aims to provide an experience that focuses on your privacy and security. Way beyond the scope of this article, but people should be concerned about their digital footprint, and using Tails is one way to improve your personal security/privacy posture.
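As promised above, moving an existing Ubuntu install over to the lighter Xfce desktop is just a package install. A sketch assuming Ubuntu's standard repositories: xubuntu-desktop pulls in the full Xubuntu experience (themes, default apps), while xfce4 is the bare desktop environment.

    # Full Xubuntu flavor
    sudo apt install xubuntu-desktop

    # Or just the bare Xfce desktop
    sudo apt install xfce4

Afterwards, log out and pick the Xfce session from the login screen.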

I will provide some visuals now. Please keep in mind that all of these distributions have windowing systems that are highly customizable. I am sure some of the pictures below do not depict the stock “out-of-the-box” look…. even if there is no box.

Ubuntu

Mint

Elementary

Arch Linux

Tails

How Can I Possibly Pick One?

Just do it. If you are taking my recommendation and test driving, either by using a “live” USB or setting up a VM, then you can try several and pick the one you like the most. Spin up a VM and install it and use it. See if you can get through a day by doing everything you need from within that Linux VM, and then a week.

If you must continue using things that are only available on Windows, then you will need to do the Windows-in-a-VM thing. You can't really set up the Windows VM inside the Linux VM to test how well that will work; well, technically you can, but nested VMs aren't fun and usually aren't worthwhile. If you want to gauge how well your Windows VM will perform, it will perform roughly the same as Windows does now, as your primary OS, with the Linux VM up and running at the same time.

The basic idea is that your machine has to be able to run two OSes. So by leaving Windows as your primary and running a Linux VM to test Linux, you are doing exactly that.

I’ve Spent Some Time Running Linux In A VM

Once you’ve spent some time in Linux on a VM under your current Windows OS, you can gauge what your life would be like if you cut over to Linux as the primary OS. If you can spend the majority of your time in the Linux VM, and only need to use Windows minimally, then cutting over is something to consider.

What About Games?

PC gaming favors Windows. I wish it weren't the case, but Linux isn't a platform many studios target, and that doesn't seem to be changing any time soon. There was a time when Linux gaming was really growing and maturing, but that momentum stalled years ago. That isn't to say there aren't games that run natively under Linux, because there are. But a serious PC gamer is still likely to need Windows for some of the games they play.

But Can’t I Play My Games In That Windows VM Under Linux?

I wish, but the answer is mostly no. Games require GPU acceleration, and that is not something you can count on working well in a VM. Yes, both VMware Workstation Player and Oracle VirtualBox support GPU acceleration in the guest VM; it just doesn't work as well as modern games need it to. If you want to try it, your best chance of success is VMware rather than VirtualBox, because VMware's GPU acceleration is better.
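If you want to experiment anyway, both hypervisors expose a 3D acceleration toggle. For VirtualBox it looks roughly like this; the VM name "win10" is just an example, the VM must be powered off, and the Guest Additions need to be installed inside the Windows guest:

    # Give the guest more video memory and enable 3D acceleration
    VBoxManage modifyvm "win10" --vram 256 --accelerate3d on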

That said, I really suggest you go into testing it out with low expectations. You might be able to get decent results with older games, but anything you want to play in 2019 isn’t going to work well in a VM.

Sadly, this means PC gamers really need to keep Windows as the primary OS, and run Linux within a VM.

What About Dual-Boot?!

Yes, you could set up dual boot between Windows and Linux. You would need some unused (unallocated/unpartitioned) disk space large enough to hold the second OS. The way it works is that you keep your existing Windows OS, install Linux on the unused space, and have the GRUB bootloader configured to present a choice of which OS to boot.
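On Ubuntu-family installers most of this is automatic: the installer shrinks the Windows partition, installs Linux alongside it, and GRUB detects Windows via os-prober. If the Windows entry ever goes missing from the boot menu, regenerating it from the Linux side looks like this (Debian/Ubuntu commands; other distributions invoke grub-mkconfig directly):

    # Scan the disks for other operating systems, then rebuild the GRUB menu
    sudo os-prober
    sudo update-grub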

This means that you are never in both OS’s at the same time, and that to switch you need to reboot. That gets cumbersome really quick.

The other problem is that setting this up can be tricky, and in the process of trying, it is very possible to make your Windows OS unbootable.

The Bottom Line

Thus, my recommendation remains to use one OS as the primary and run the other in a VM, and to make the primary the OS you just can't live without and can't cut out of your life.

What About Raspberry Pi and Raspbian?!?

I would be remiss if I didn't mention that another way of experiencing Linux is on a $35 device, the Raspberry Pi 3. The best Linux flavor for the Pi is the distribution maintained specifically for it, Raspbian. Raspbian is a Debian derivative (like Ubuntu), but it includes only software packages appropriate for the low-horsepower Pi. For example, it ships with a lightweight desktop environment, since running GNOME 3 on a Pi would be extremely painful.

In my experience, using a Raspberry Pi for desktop-oriented work is painfully slow. But for almost anything other than desktop use, the Pi is wonderful. This blog site is running on a Pi!

Cloud!

It is late in the year 2014, and for the past several years the various vendors have been preparing their cloud solutions. Amazon led the charge and pretty much set the industry standard with EC2, and now all the vendors are following suit, and CIOs and CTOs have to figure out how best to integrate cloud into their IT strategy. Those that have built careers by building out huge data centers, strategically placed across the globe with thousands of purchased servers, are finding it difficult to transform.

The idea of cloud is simple: it's infrastructure as a service (IaaS). You need a server, you go up to the cloud and provision one, and a few minutes later you receive the details and credentials to connect, and you are logged in and installing the packages needed to support the application stack. You need ten servers? A hundred? No problem! The cloud provides all you need, along with the supporting services: virtual networks, storage, directory services, and so on.
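To put "a few minutes later" in perspective, provisioning a server on EC2 from the command line is a single call. A hedged sketch using the AWS CLI; the AMI ID and key pair name below are placeholders you would substitute with your own:

    # Launch one small Linux instance (example values, not real IDs)
    aws ec2 run-instances \
        --image-id ami-12345678 \
        --instance-type t2.micro \
        --count 1 \
        --key-name my-keypair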

Gone are the days of old, where you had to wait months for hardware procurement, followed by weeks for the various infrastructure silos to rack/stack/blast/config your server and hand it off to your AppDev team. Let’s not forget about the weeks that get tagged onto that provisioning cycle for the various approvals and sign-offs.

No, we don't do that anymore. Now you can chop months off the time-to-market for your great new business-enabling technology idea... or at least that should be the case. The challenge is that the infrastructure camps have resisted embracing the cloud, because that would require them to reinvent pretty much everything they do; they have spent 20+ years learning how to be a server provisioning/deployment shop.

There are valid reasons to pause and do a lot of research and deep planning on your cloud strategy. In order to successfully integrate the cloud into your IT business model, you will have a lot of issues to resolve in the regulatory, compliance, legal, HR, risk, and IT security areas. Corporate executives know that their best interests are served by not letting data reside on third-party systems, which means a public cloud service is out of the question. Or perhaps you come out of those deep discussions with agreement that data can be assigned to categories, and certain data can reside in a cloud while some cannot, which is good, but you have now created another thing that needs to be policed and audited. Or you might be able to sell the idea of an encrypted VPN to the cloud, plus the cloud vendor agreeing to completely ring-fence your cloud components, which also would need to be policed and audited.

Yes, the cloud solves a lot of problems, but creates some risks and requires a lot of stuff to be thought about and ironed out. The time to do that, obviously, is before putting out the capital expense. You can’t really even decide what kind of cloud you will have until these discussions have happened. It is either going to be 100% in the public cloud (not likely for most companies), or 0% in a public cloud and 100% in an internally grown cloud, or some combination of the two.

The problem with the latter two scenarios, where some form of internal, home grown cloud is in the strategy, is that the internal cloud needs to be designed, built, deployed, and operated by the same infrastructure folks that have spent the past two decades convincing themselves that a 4 month infrastructure procurement and deployment cycle is perfectly fine… and the same infrastructure organization that at this very moment probably can’t provide transparency into the exact makeup of your allocation charges.

I don’t mean to sound critical of infrastructure organizations, but the scenario I just described exists all over the place in the largest corporations across the globe.   As a result, senior managers in infrastructure now have to figure out how to build a cloud that is as mature and well-thought out as Amazon’s EC2 whilst starting a decade behind.

You are competing with Amazon's mature, battle-tested cloud solution because your AppDev teams have been playing with it at home, or they came from jobs that could leverage Amazon's public cloud. They know that procuring a Windows or Linux box should take about 8 minutes. They know that they can expect to be charged 17 cents an hour, and only while the server is powered up. They know that they need to design their apps to scale out, so the app knows how to create more servers in the cloud to accommodate workload peaks and scale down to reduce cost.

All of that means your cloud needs to be able to do all that: automated provisioning and decommissioning, fully baked cost allocation models, an API, and so on. You have to be able to scale the capacity of the underlying hardware infrastructure accordingly, and make that transparent to the app layer, because a cloud that is "all filled up" and can't accommodate new provisioning or growth requests for four months while more SAN shelves and/or hypervisor hosts are procured would take the business right back to the days of yore.

 

Tom C