Linux as a Desktop OS in 2019

I doubt there are many people who haven’t heard of Linux in 2019. For those who haven’t, Linux is an operating system created in the early 1990s by a University of Helsinki student named Linus Torvalds. He created a Unix-like operating system as a curiosity project while studying operating systems, and let’s just say it was well received. Today, in Fortune 500 companies, the number of Linux servers exceeds the number of Windows servers, and Sun servers have pretty much become extinct.

In the desktop space, Linux was not highly viable until about ten years ago. Today, it is quite viable for both desktops and laptops. Hardware compatibility is very wide, and the vast majority of applications are built for Linux in addition to Windows and Mac. However, even in 2019, there are applications that only work in a Microsoft Windows environment. You can guess that Microsoft Office is one such application set. I will talk more about that later.

Before we jump into the things you would gain by running Linux as your desktop or laptop OS, let me just say this blog is not intended to be a “how to” guide and instead is more of a “why to” guide.

Why Linux on the Desktop — What Do I Gain?

To put it simply, you gain freedom, security, personal privacy, performance, stability, supportability, lifecycle management, and cost savings. These things are simply inarguable, but there is one more which could be argued: you gain coolness.

Freedom

By transitioning to Linux as your desktop OS, you are no longer locked into a proprietary OS from a vendor that decides what features and technology stacks you need. You run what you need and want to run. If you want to store documents on a cloud, you can decide to do that, but you won’t have OneDrive forced upon you and taking up system resources when you don’t even want it. Same thing with Skype, or OneNote, or Cortana.

Personal Privacy

Speaking of Cortana, you no longer have every word you type spied upon and transmitted to a vendor so they can tailor ads and services to you… and you hope and pray the vendor never gets breached, never simply decides to violate their own privacy policy, and doesn’t have a legal army it can use to blatantly violate your privacy and get away with it. Yes, you can turn off these scary things in Windows 10… but what if the ability to turn them off is broken?

Security

You no longer have to wait for that vendor to decide to push patches to you on a monthly or quarterly basis. To be fair, Microsoft does do emergency patch releases for the most critical threats.

You also get an OS that doesn’t need a virus scanner running 24/7 — usually. The vast majority of malware targets Windows systems.

In terms of security, you get an OS that was designed and built from day one to be secure. Linux is secured in its default state; Windows is not, and it grants users more power than they should have, making it easy to inadvertently do dangerous things.

Lastly, security is increased and threat risk lowered simply because Linux distributions tend to run only the services and software that the end user specifically asks for. As an example, by default there are no file-sharing services running on a Linux box unless they have been deemed necessary and installed. Windows 10, on the other hand, installs with file sharing enabled and its services running. This means that whether or not the user intends to share files, the services are installed, running, and presenting a risk of a bad actor or malware on the network penetrating them. The fewer services you run, the smaller your attack surface and the lower your security risk. An attacker cannot attack a service that isn’t installed or running.
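
If you want to see this difference for yourself, here is a quick audit sketch for a systemd-based distribution such as Ubuntu (the commands are standard, but the output will obviously vary by system):

    # List every service currently running
    systemctl list-units --type=service --state=running
    # List every TCP port something is listening on, and which process owns it
    sudo ss -tlnp

On a fresh desktop install, that second list is remarkably short — which is exactly the point.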

Stability

You no longer have patches pushed to you which have escaped wide testing and which might make your system unbootable or unstable. There was a recent emergency patch that made some Lenovo laptops unbootable. There was another where an emergency patch to mitigate the “Meltdown” threat made some AMD systems unbootable. Then there was the time a Microsoft patch put systems with Intel SSDs into a crashing boot loop.

Despite a track record showing serious problems with its patch release management, Microsoft still feels it is best to manage patches for the user. What this means is you get patches when Microsoft wants, and your system can potentially be broken at a time when you can’t afford to have downtime.

That isn’t to say that there has never been a patch for a Linux distribution which induced problems. It happens, but it is pretty rare. And at least you get to choose when to risk it, and you can roll back from it. Which reminds me to mention that in most cases you can’t roll back from a bad Microsoft update.
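
To make that concrete, here is a sketch of rolling back a problematic package on a Debian/Ubuntu system (the package name and version string are illustrative, not a real advisory):

    # See which versions of a package are available
    apt list --all-versions apache2
    # Downgrade to a known-good version (version string is an example)
    sudo apt install apache2=2.4.29-1ubuntu4.1
    # Hold the package so the bad update isn't re-applied until you are ready
    sudo apt-mark hold apache2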

Microsoft also takes control of what drivers are used for the devices in your system, and in many cases the drivers deployed to end users via Windows Update are a couple of versions behind the most current. What this means is you are not benefiting from the device manufacturer’s development efforts. I was working an issue on one Windows 10 laptop which had very unstable wireless and would lose all ability to connect until the device was power cycled. I was able to resolve this by installing the most recent driver from the device manufacturer, which would fix the issue until Microsoft sent updates and effectively rolled the driver back to the problematic one.

Performance

The Linux kernel is far more efficient at managing hardware resources and multitasking/threading, and the end result is that things just run faster. There are plenty of examples where people have done performance testing and benchmarks comparing Windows and Linux.

Aside from the kernel simply performing better, the entire end-user experience sees a performance benefit because there is less bloat in the operating system and installed packages.

This means you get more bang for your buck in terms of the return on your investment in PC hardware.

Supportability

I have to admit that I really have a negative view of Microsoft in the area of support, and I am sure what I write here will show that negative bias. Over the years I have seen, countless times, people experiencing a valid issue go to a Microsoft community board and post their issue in a very clear and concise manner, demonstrating they have spent appropriate time before seeking support, and the only thing that comes back from a Microsoft representative is something along the following lines:

“My name is [NAME HERE] and I am sorry to hear you are experiencing problems with [PROBLEM SUMMARY HERE]. I am happy to say I can assist you with this problem. Please do the following:

  • In the search box on the taskbar, enter Command Prompt. Press and hold (or right-click) Command Prompt (Desktop app) from the search results and select Run as administrator.
  • Enter DISM.exe /Online /Cleanup-image /Restorehealth (note the space before each “/”). (Note: this step may take a few minutes to start and up to 30 minutes to run and complete.)
  • Enter sfc /scannow (note the space between “sfc” and “/”).”

The Microsoft representative almost always chimes in as soon as the thread is started, with some variation of the above. 99% of the time, the poor user goes off to do these things, and comes back and says “Hi, I did that and the problem continues… what now?”

What happens next is crickets chirping while the user waits for the Microsoft representative to advise next steps, which in most cases never come; but if next steps do come, it is almost always “go into System Restore in Control Panel and choose the option to Reset This PC”.

That is pretty much the Microsoft support troubleshooting script: dism /RestoreHealth, then sfc /scannow… then reset. For those not aware, “Reset This PC” is a drastic step which, while it does preserve the user’s documents, is still a long process and ultimately, in many cases, still doesn’t fix the problem, because the problem is within Windows itself and appears even on a fresh install.

How most problems get fixed in Microsoft land is that eventually a Microsoft Certified Professional who spends his/her own time helping the community will come along and post the real resolution or workaround.

Over the years I have seen this so many times that it really makes me angry. There is nowhere you can go to see what known bugs are out there, search them, and track whether someone at Microsoft is aware, what the disposition is, and, if it is verified and will be patched, when that will happen.

Not with Windows, that is. With the major distributions of Linux, the public does get access to the issue-tracking system and can report issues, search, and so on. Ubuntu’s tracker, for example, lives on Launchpad: https://bugs.launchpad.net/ubuntu

What Linux users do when they face an issue which doesn’t appear to be within their own control (e.g. their own mistake or misconfiguration) is go to the appropriate issue tracker and see if their problem is already reported and, if so, how it was resolved, how it can be worked around, and so on.

To boil it down to the simplest elevator pitch: the free support you get from the community surrounding your completely cost-free Linux distribution is miles better than the support you get from the vendor you paid $139 for an operating system that is supposed to be stable and empower your work rather than impede it.

Cost

It seems plainly obvious that an operating system that costs $0 is a cost savings versus an operating system that costs $139 to license. However, the savings are actually much greater.

Because running Linux is almost always done in tandem with moving away from proprietary software and embracing open source alternatives, there are also the savings associated with not needing a Microsoft Office license, or Photoshop (GIMP > Photoshop), or 3D modelling software (Blender rules).

In addition to the savings on the OS and software, you also have the cost gains associated with the increased performance, the reduced downtime, and so on. Some would argue that you can’t put a price tag on having an operating system that doesn’t spy on you, isn’t a breeding ground for malware, and isn’t a ticking bomb waiting for an attacker to gain access to your system by exploiting Windows vulnerabilities.

Lifecycle

The leading distributions of Linux have a support timeframe that is much longer than that of a Windows release; Ubuntu LTS releases, for example, are supported for five years. This means you can run it for many more years while the OS and all the software packages continue to be maintained, bugs fixed, and security holes closed.

The useful lifespan is increased because of the support cycle, and also because Linux is higher performing, less bloated, and therefore less demanding of hardware. To put it simply, as the years go by, more things you probably won’t use are crammed into the Windows OS, and those things consume disk, memory, and CPU cycles. Linux distributions are intentionally managed so as not to cause this kind of bloat.

There Have To Be Downsides – What Are They?

Complexity and User Learning Curve

Obviously, many of the benefits mentioned also carry downsides if examined from the opposite direction. For example, Linux is less prone to virus/malware spread because it is not so easy for the user to run a script or executable; the converse is that it is, well, harder to use overall. While that is true, it isn’t that much harder to do the things you want and need to do.

Some would disagree, but I still can’t see managing a Linux system without ever dropping to the command line to get things done. Yes, there are GUI apps that provide access to the command-line features, but I still can’t say that one should plan on running Linux without planning to learn how to use the command line to get basic things done.

Also, earlier we talked about how Linux doesn’t have all software installed and running by default. We said that was a good thing from a security perspective. But what do you do if you really do want to run an Apache web server?

The answer, assuming a Debian-derivative distribution (Ubuntu, etc.), is that you have to install the Apache httpd package, and if you are using the built-in firewall you have to configure it to allow httpd traffic. Here is a guide: https://www.digitalocean.com/community/tutorials/how-to-install-the-apache-web-server-on-ubuntu-18-04-quickstart

If one plans to run Linux but doesn’t plan on being able to follow instructions like this and carry out commands, that is not a good plan. That said, luckily most distributions don’t enable UFW (the firewall) by default, so you can skip that part of the guide, and it boils down to really one command line: “sudo apt install apache2”.

When that command is executed, the system installs any dependent packages that are required and installs the Apache web server in a very basic configuration, which the user can then customize to work as they require.
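
Putting the pieces together, a minimal sketch of the whole sequence looks like this (assuming Ubuntu; the ufw line is only needed if you have enabled the firewall):

    # Install Apache; apt pulls in any required dependencies automatically
    sudo apt update
    sudo apt install apache2
    # Only if UFW is enabled: allow web traffic using the profile Apache ships with
    sudo ufw allow "Apache Full"
    # Verify the service came up
    systemctl status apache2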

Everything in Linux is modularised like this, and while that is a good thing from many perspectives, the bottom line is that it brings a slightly higher learning curve.

The good news is, everyone who is now an expert in Linux, and whose guides you will read and follow, was once as inexperienced as you. And when you suck it up and dive in, it won’t be long before you are a much more well-versed computer user.

Hardware Incompatibility

If your system, and the devices in or connected to it, are less than 10 years old, you can expect the major Linux distributions to install, detect your hardware, and provide an “up and running” experience.

That said, it would be wise to inventory your hardware and specifically research and verify that there are Linux drivers for everything. It used to be that when new hardware hit the market, on day one it wouldn’t have a Linux driver or support and all you could do was hope. It really isn’t like that anymore.
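
A convenient way to take that inventory is from a “live” USB session (more on those later), since you can poke at the hardware without installing anything. A sketch (lshw may need installing on some distributions):

    # List PCI devices and which kernel driver (if any) each one is using
    lspci -k
    # List USB devices
    lsusb
    # Show detected network hardware and its driver
    sudo lshw -C network

If every device reports a “kernel driver in use”, you are in good shape.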

I would be remiss if I didn’t mention that some hardware has been “crippled” so that it can never run on Linux. It is very rare these days, but what used to happen is that a device would offload all or part of its firmware to software within the Windows driver, and the maker would neither provide a Linux driver nor make the firmware open source.

What that means is that it becomes illegal for anyone to even try to reverse engineer the driver to make the device work under Linux. This kind of practice used to be somewhat common, but these days it is not, as the community has pushed all the major industry players to make “open source” part of their guiding principles. If you are unlucky enough to have a hardware device this has been done to, you didn’t want to run that device anyway.

You Require Proprietary Software

We will talk about this in great depth next, but if you think you cannot run Linux because you require software that isn’t available on the Linux platform, there are ways around this. So let’s just dive into that now.

What Can I Do If I Want to Run Linux But Must Have Windows Apps Like Microsoft Office?

It used to be that if you needed Microsoft Office, that was a stopper for adopting Linux as a desktop OS. But over the past couple of decades a few things have happened that make it less of a stopper: the WINE compatibility layer, open source alternatives, and virtualization.

Windows Compatibility – WINE

WINE (a recursive acronym for “Wine Is Not an Emulator”) is a compatibility layer for running native Windows applications on Linux. A lot of applications work fully under WINE, some work partially, and some won’t even run. The reality is, you might get Microsoft Office 2007 to run under WINE.

Employ Open Source Alternatives

A better way of handling Microsoft Office under Linux is to avoid it altogether. There are several open source alternatives to Microsoft Office; LibreOffice is my favorite. Other options include FreeOffice, WPS Office, OpenOffice, etc. Any of these alternatives will provide the core functionality of Microsoft Word, Excel, and PowerPoint. Of course, you could also migrate away from Microsoft Office by using the Google Docs apps. That said, you won’t find 100% compatibility. For example, if you are dependent upon VBA macros embedded in MS documents, that is going to be a problem (and a dependency that is a problem in itself and needs to be fixed).

It is important to point out that you do not need to run Linux as your operating system to benefit from the MS Office alternatives. Even if you don’t plan on running Linux, it is still a good idea to decouple yourself from the Microsoft Office products. I subscribed to Microsoft Office 365 until a few years ago, but terminated my subscription because it wasn’t gaining me anything I couldn’t get from open source alternatives, and it really did nothing but pollute the Windows event logs.

Virtualization

The last option for dealing with applications that create a stopper for adopting Linux is virtualization. Virtualization is a technology which allows one to create a “virtual” machine (VM) running within the operating system being used. To put it simply, if you are running Linux, you can create a Windows VM, and then within that VM you install and run the software. Windows 7 and 10 run well as virtual machines. Windows XP, 95, and 3.1 can also run as VMs, but why would you?

An interesting note is that Windows 7 Professional used virtualization technology to provide backwards compatibility with legacy applications. Most legacy applications could run under Windows 7 by employing a few tricks, but some could not run in any way. Windows 7’s “XP Mode” used Microsoft’s Windows Virtual PC virtualization technology to provide this last-ditch compatibility mode. Launching programs installed in this mode took longer, because of the time it takes to spin up the VM engine, and they also ran slower because of the performance hit that came from running the VM engine and the VM.

“Virtualization” Does Not Mean “Slow”

If you’re saying “virtualization was slow in Windows 7, so why would I ever want to use virtualization to tear down stoppers to running Linux?”, here is why: in 2019, modern CPUs, meaning anything produced in the last five to eight years, all have virtualization features that drastically reduce the performance overhead. If your CPU is Intel, it should have VT-x. If you have an AMD CPU, it should have AMD-V.

Note that on older systems the virtualization features might be available in the CPU but not enabled in the BIOS, in which case it is necessary to go into the BIOS, find the setting, and enable it. If you are unsure, there are plenty of diagnostic apps that you can use to verify that your system supports VT-x or AMD-V. Or you can just install the virtualization software; it will warn you if VT-x or AMD-V is not found.
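
From a Linux live session or VM, you can also check for these CPU flags directly; a quick sketch:

    # vmx = Intel VT-x, svm = AMD-V; no output means unsupported or disabled in BIOS
    grep -E 'vmx|svm' /proc/cpuinfo
    # A friendlier one-line summary
    lscpu | grep -i virtualization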

You can still virtualize without these CPU features, but you cannot virtualize a 64-bit operating system, and you can expect it to perform poorly.

Virtualization to Run Native Windows Apps

Using virtualization to run native Windows apps is as simple as installing the virtualization software, creating the virtual machine, and installing the Windows OS and whatever applications are required.

Virtualization Software

For virtualization software, your choice is either VMware Workstation Player or Oracle’s VirtualBox. VMware Workstation Player is paid software, but you can get a license to use it at no cost for your own personal use. Oracle’s VirtualBox, on the other hand, is licensed under the GNU General Public License v2, which for your purposes just means it is free now and will remain so. The problem with using VMware Workstation Player’s free license is that it is a more restrictive license which could be terminated if VMware so decides. It isn’t likely they would decide to do that.

Also, VMware’s license could cause you some legal problems if the work you do on that PC is considered “commercial”. From a licensing and legality perspective, your safer choice is VirtualBox. It is important to state that this is my opinion and advice, not legal advice.

You Need a Windows License

Running Microsoft Windows as a virtual machine requires a license. In other words, you have to buy it. If you are replacing your current Windows OS with Linux and then hope to use your current Windows license inside that VM, there are some hoops to be jumped through technically, and it may not be 100% legal. It may not be legal if your license for Windows is tied to the physical hardware, such as an OEM version of Windows that came pre-installed on that PC.

Another option is to use an evaluation copy of Windows. Microsoft has offered free eval versions of Windows for quite some time. The problem is that the free license typically only lasts 90 days and can be renewed a limited number of times. After the eval license expires, the copy of Windows becomes quite useless.

In a nutshell, right here is where it can get complicated — not technically but legally. This might be a good place for most people to simply decide to cut ties altogether with that piece of Windows-only software.

But if you must continue down this path, please do your homework and make sure you have the needed Windows product key in order to install and activate Windows within the VM.

I’m Ready To Do It!

Ok, so you are ready to go for it. Before you do, I’d like to tell you what to expect. Will you have a windowed GUI, a desktop, a taskbar? If so, what does it look like? I will talk about that shortly, but let me first make a suggestion.

Try it out first!

You can try Linux in a couple of ways. Most distributions offer an image you can put on a USB thumb drive, and then boot your system off of that thumb drive. Just be careful not to choose the option to “Install” and instead use the option to “Try It”. Don’t worry: even if you click the “Install” button, it takes several steps into the procedure, with a few “are you sure you want to do this?” prompts, before anything is changed, which would clue you in that you made an error.
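
If you already have access to a Linux box, writing the downloaded image to a USB stick is a one-liner; on Windows, tools like Rufus or Etcher do the same job. A sketch (the ISO filename and /dev/sdX are placeholders; double-check the target device with lsblk first, because dd will overwrite whatever is on it):

    lsblk
    sudo dd if=ubuntu-18.04-desktop-amd64.iso of=/dev/sdX bs=4M status=progress conv=fsync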

Or, you could install the virtualization software (VMware or VirtualBox) under your current Windows, then use that to create a VM and install Linux to that VM. I recommend this approach, especially if you plan to transition to Linux as your primary OS and then virtualize Windows from within Linux. By trying it in this fashion, you are essentially doing the opposite: gaining experience with the virtualization software and validating that your hardware has the necessary features to virtualize and perform well.
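
If you go the VirtualBox route, the GUI wizard is the easy path, but the same setup can be scripted with VBoxManage. A sketch with illustrative names and sizes (the ISO path is a placeholder):

    # Create and register the VM, then give it memory, CPUs, a disk, and the installer ISO
    VBoxManage createvm --name linux-test --ostype Ubuntu_64 --register
    VBoxManage modifyvm linux-test --memory 4096 --cpus 2
    VBoxManage createmedium disk --filename linux-test.vdi --size 25000
    VBoxManage storagectl linux-test --name SATA --add sata
    VBoxManage storageattach linux-test --storagectl SATA --port 0 --device 0 --type hdd --medium linux-test.vdi
    VBoxManage storageattach linux-test --storagectl SATA --port 1 --device 0 --type dvddrive --medium ubuntu-18.04-desktop-amd64.iso
    VBoxManage startvm linux-test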

Where Do I Get It, and What Is This “Distribution” Thing You’ve Mentioned?

Linux is not made or sold by any one company. The Linux kernel is the part that Torvalds wrote and continues to evolve and maintain with a team of global volunteers, but on its own it literally won’t do anything for you. You won’t even get a command prompt, and you certainly wouldn’t end up with a useful system.

When we say “Linux” these days, we mean the Linux kernel plus all the “stuff” (software layers) that is needed to make good use of that kernel in a modern context. The distributions are the kernel plus the “stuff”. There isn’t just one distribution because not everyone needs or wants the same “stuff”.

Just to give some insight into what this “stuff” is: there is not just one windowing GUI available for Linux. GNOME 3 is the most popular windowing system, and is comparable to the modern Windows UI, except it is a heck of a lot more customizable. GNOME 3, being comparable to the modern Windows UI, takes a pretty significant bite out of compute resources. Just like Windows “Aero” themes, GNOME 3 has transparency, animations, etc. And also like Windows Aero, it requires a hefty CPU and GPU. If one is less interested in the glitz and glam, and the compute resources it eats up, one might like Xfce better than GNOME 3.

Xfce provides a windowing system but is designed to be lightweight: no transparent windows or bars, no animated windows, etc. A system running Xfce won’t need as much horsepower. And even if the system has the horsepower, perhaps the user doesn’t want it eaten up by the windowing system.

The point is, you run what you want to run, and only what you want to run, and that is usually driven by the hardware you have and how much horsepower you want given over to any given compute function.

It isn’t just about the windowing system; all the common software packages can vary. That is really what these distributions are: folks bundle the Linux kernel with the appropriate software packages in order to fulfill a common set of compute requirements.

A desktop distribution will give you a windowing system (UI), system utilities, an office suite (LibreOffice), a web browser (usually Firefox), a printing system, and so on.

A server distribution, on the other hand, usually has no UI, all the utilities are command line, there is no office software, and the print subsystem is absent. But it does have the Apache web server, PHP, a MySQL database server, etc.

That is really what distributions are about: giving you the variety that best fits your compute needs so you are up and running quickly after installing. Do not misunderstand, though: you can start with a desktop distribution because you want a windowed UI and office stuff, and also add the server bits if you need them. That really is what the flexibility of Linux is all about.
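
As a concrete example of adding server bits to a desktop install, Ubuntu’s tasksel shorthand makes a whole LAMP stack one command (a sketch; package names vary slightly by release):

    # The trailing caret tells apt to install the "lamp-server" task
    sudo apt install lamp-server^
    # Or cherry-pick the individual pieces
    sudo apt install apache2 php mysql-server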

Since we are talking about the desktop context here, there are a handful of popular choices:

  • Ubuntu Desktop – The most popular and most used distribution, with a specific “Desktop” blend. Probably the easiest to install and use. Uses the GNOME 3 desktop by default, but can be changed to Xfce for a lighter-weight windowing system. There is a lighter blend targeted at aged/low-powered hardware called Lubuntu… L for “Light”.
  • Linux Mint – Trending these days. Has its own windowing system that was forked from GNOME. It was originally a derivative of an Ubuntu distribution called Kubuntu, which was an Ubuntu blend that used the KDE desktop environment, a windowing system that many preferred in the 2000s; but today even Mint uses a GNOME derivative as its UI shell.
  • Elementary OS – Another Ubuntu derivative, aims to be a “lesser learning curve” OS.
  • Arch Linux – Targets x86-64 architectures only, and aims to adhere to the KISS principle, minimalism, etc.
  • Tails – This trending desktop OS specifically aims to provide an experience focused on your privacy and security. It is way beyond the scope of this article, but people should be concerned about their digital footprint, and using Tails is one way to improve your personal security/privacy posture.

I will provide some visuals now. Please keep in mind that all of these distributions have windowing systems that are highly customizable. I am sure some of the pictures below do not depict the stock “out-of-the-box” look…. even if there is no box.

Ubuntu

Mint

Elementary

Arch Linux

Tails

How Can I Possibly Pick One?

Just do it. If you are taking my recommendation and test driving, either by using a “live” USB or by setting up a VM, then you can try several and pick the one you like most. Spin up a VM, install it, and use it. See if you can get through a day doing everything you need from within that Linux VM, and then a week.

If you must continue using things that are only available on Windows, then you will need to do the Windows-in-a-VM thing. You can’t set up the Windows VM inside of the Linux VM to test how well that is going to work. Well, technically you can, but nested VMs aren’t fun and usually aren’t worthwhile. If you want to gauge how well your Windows VM will perform, it will perform roughly the same as Windows does when you are using it as your primary OS with the Linux VM up and running at the same time.

The basic idea is that your machine has to be able to run two OSes at once. By leaving Windows as your primary and running a Linux VM to test Linux, you are doing exactly that.

I’ve Spent Some Time Running Linux In A VM

Once you’ve spent some time in Linux in a VM under your current Windows OS, you can gauge what your life would be like if you cut over to Linux as the primary OS. If you can spend the majority of your time in the Linux VM and only need to use Windows minimally, then cutting over is something to consider.

What About Games?

PC gaming favors Windows. Linux isn’t a platform many studios target, and that doesn’t seem to be changing any time soon. I wish it wasn’t the case, but it is. There was a time when Linux gaming was really growing and maturing, but that stopped many years ago. That isn’t to say there aren’t any games that run natively under Linux, because there are. But a serious PC gamer is likely going to need Windows for some of the games they play.

But Can’t I Play My Games In That Windows VM Under Linux?

I wish, but the answer is mostly no. Games require GPU acceleration, and that is not something you can count on working well in a VM. Yes, both VMware Workstation Player and Oracle VirtualBox support GPU acceleration in the guest VM; it just doesn’t work as well as it needs to for modern games. If you want to try it, your best chance of success is to use VMware rather than VirtualBox, because VMware has better GPU acceleration.

That said, I really suggest you go into testing this with low expectations. You might get decent results with older games, but anything you want to play in 2019 isn’t going to work well in a VM.

Sadly, this means PC gamers really need to keep Windows as the primary OS, and run Linux within a VM.

What About Dual-Boot?!

Yes, you could set up a dual boot between Windows and Linux. You would need some unused (unallocated/unpartitioned) disk space large enough to hold the second OS. The way it works is you keep your existing (Windows) OS and install Linux on the unused space, and the GRUB bootloader is configured to present a choice of which OS to boot.
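
If you do go this route on a Debian/Ubuntu install, it is worth confirming that GRUB actually found Windows before you reboot; a sketch:

    # Should report the Windows boot manager if GRUB can see it
    sudo os-prober
    # Regenerates the GRUB menu with everything os-prober found
    sudo update-grub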

This means that you are never in both OSes at the same time, and that to switch you need to reboot. That gets cumbersome really quickly.

The other problem is that setting this up can be tricky, and in the process of trying, it is very possible to make your Windows OS unbootable.

The Bottom Line

Thus, my recommendation remains to use one OS as the primary and run the other in a VM, where the primary is the OS that you just can’t live without and can’t cut out of your life.

What About Raspberry Pi and Raspbian?!?

I would be remiss if I didn’t mention that another way of experiencing Linux is on a $35 device, the Raspberry Pi 3. The best Linux flavor for the Pi is the distribution that is maintained specifically for it: Raspbian. Raspbian is a Debian derivative (like Ubuntu), but it includes only software packages that are appropriate for the low-horsepower Pi. For example, it comes with a lightweight LXDE-based desktop (PIXEL), since running GNOME 3 on a Pi would be extremely painful.

In my experience, using a Raspberry Pi for desktop-oriented work is painfully slow. But as anything but a desktop OS, the Pi is wonderful. This blog site is running on a Pi!

Raspberry Pi Cluster — “Why To” Guide

This will be more of a “why to” guide and not so much of a “how to”, and certainly not a step-by-step on the setup of an individual Pi. A good place to start for setting up an individual Pi is the Raspberry Pi site at: https://www.raspberrypi.org

A single Raspberry Pi 3 has enough compute power to be a decent general-purpose Linux box, run a Linux/Apache/MySQL/PHP (LAMP) stack to serve a website, or be a streaming media server by running OSMC. But what a single Pi can’t do is all of the above at the same time while handling a respectable workload.

Back in the day, techies would run a few Linux boxes in their den or closet, usually re-purposed desktop PCs that were once very powerful desktop or gaming systems but had become inadequate for those purposes. While you can still do this today, you have to consider the cost of powering a full desktop or gaming PC 24/7. When you factor that in, plus the cost of the air conditioning, the noise, and the physical footprint these antiquated PCs take up, it no longer makes sense.

Also, in the event of an extended power outage — such as what we experienced with Hurricane Sandy — you need your infrastructure to draw as little electrical power as possible. Running your services on low-powered Pis versus antiquated desktops, you can run longer on UPS; and if you end up living on a generator for weeks, your stack of Pis will place a negligible load on it.

I should clarify that the word “cluster” in the context of multiple Pis doesn’t mean one big Linux box that runs across all of them. It is simply a handful of Pis with services distributed across them.

I run three Pis, or now I suppose four as of today, with the workloads split out as follows:

  1. One Pi 3 running OSMC acts as the streaming media server for the living room entertainment center.
  2. One Pi 2 running a LAMP stack plus WordPress is what is serving up this page for you right now.
  3. One Pi 2 running RasPBX, which is a distribution that specializes in making your Pi run Asterisk and FreePBX, so you can have a full enterprise-level phone system, complete with voicemail, conference bridges, etc.
  4. My newest Pi 3 will be a general-purpose box and playpen.

Usually you acquire multiple Pis over time: buying one, then another, then another. Over time you end up with a pile of Pis just hanging about and a mess of cables. Or perhaps you bought a case for each one, but you find they don’t stack well and are still a mess that is hard to manage.

To solve that problem, I recommend the Dog Bone case by Geaux Robot, sold at Amazon here: https://www.amazon.com/GeauxRobot-Raspberry-Model-4-layer-Enclosure/dp/B00MYFAAPO

This case will accommodate four (4) Raspberry Pi model 2 or 3 boards. They also offer 2-Pi and 3-Pi versions. It will keep your Pis stacked neatly as a single unit while providing good airflow for cooling, and it looks nice and techy.

Next, you need to consolidate all those power cords. If you bought a power adapter for each Pi, you will find that they don’t play nicely with a UPS power strip and take up too many outlets. (You do need a UPS; more on that later.) If you buy a powered USB hub that can provide 2A of power to each port, then you can plug that USB hub into your UPS and plug the Pis into it via USB cords. The net result is that the entire stack of Pis consumes only one power port on your UPS.

I used this, which is a multi-port USB power supply designed for delivering power rather than a data hub: https://www.amazon.com/gp/product/B0115MVRO4

I condensed my stack of Pis onto this power source, as well as three Netgear switches, so that all of those devices consume only one port on my UPS.

Why do you need a UPS? Because the Raspberry Pi’s SD card can become corrupt if power is cut without a graceful system shutdown. What that means is that if you lose power, your Pi may not boot, and you have to spend time recovering the image — or starting with a new one and reconfiguring your services. Plus, your Pis will become an integral part of your home infrastructure, whose services you don’t want to lose if it can be avoided.
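
If your UPS has a USB data port, you can go a step further and have it shut the Pis down cleanly when the battery runs low. A sketch assuming an APC unit and the apcupsd package (other brands have equivalents, such as NUT):

    sudo apt install apcupsd
    # In /etc/apcupsd/apcupsd.conf, set roughly:
    #   UPSCABLE usb
    #   UPSTYPE  usb
    #   DEVICE
    sudo systemctl enable apcupsd
    sudo systemctl start apcupsd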

Before I move off the topic of power, it is important to point out that stable power is critical. If your power source cannot provide at least 1.5A to each Pi, you run the risk of the Pi locking up, crashing, rebooting, etc. Again, this runs the risk of corrupting the Pi’s SD card, because the reboot happens without a graceful shutdown.

For the network, you really should get a 5- or 6-port gigabit switch: 4 ports for the Pis and another that uplinks to your other switches. You can, of course, just plug the Pis into an existing switch if you happen to have enough ports available. But segregating the Pis onto their own small switch is cleaner, because you can strap the Dog Bone case to the switch and manage the whole thing as a single unit. Usually you buy, or make your own, very short patch cords.

A word about wireless: avoid it if you can. You can use it, of course, but each Pi will lose signal every so often, or you will end up with your wireless network oversaturated. Over the past couple of years I have re-worked my home infrastructure to put all infrastructure back on wired Gig-E, along with the networked cameras, desktop PCs, and gaming rigs, so the only things consuming wireless are the mobile devices, and everyone is happy and fast.

I recommend using the “Raspbian Lite” image instead of the default image, which includes the X Window System GUI and desktop environment. If you install the default, you will play with it long enough to realize it isn’t a viable desktop environment for everyday use, and then you will spend time figuring out how to uninstall all that bloat. Install the Lite version instead, and then install on top of it only the services you really want and need.
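
A sketch of the Lite-first approach (the empty “ssh” file trick is a documented Raspbian feature; the mount path and package names are assumptions that vary by release):

    # Enable SSH headlessly: drop an empty file named "ssh" on the SD card's boot partition
    touch /media/$USER/boot/ssh
    # Then, on the Pi, install only what your planned services need, e.g. a web stack:
    sudo apt update
    sudo apt install apache2 php mariadb-server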

Typically the only thing I install, other than the packages required for the planned services (Apache httpd, MySQL, etc.), is Webmin, in order to manage and administer the system.

You may have picked up that I do not have firewall services running on any of the Pis. I love pfSense, but as far as I am aware it is still next to impossible to get running on an ARM-based Pi. It might be possible these days, but it is just not something I want to spend time on. I run pfSense on a dedicated Zotac C Series CI323, which is a mini x86 system that is fanless and draws little power.

None of my Pis are directly on the public Internet, but I do have rules on the pfSense firewall to pass the traffic that is appropriate to each service. Therefore I do not talk much about security here, but it is very important to consider. I personally would not put a Pi directly onto the public Internet, but that doesn’t mean it can’t or shouldn’t be done. If you go that route, be extremely diligent about managing and applying patches and hotfixes, use strong passwords, and run some form of protection like fail2ban.
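
For anyone who does expose a Pi, a minimal hardening pass might look like this (a sketch, not a complete checklist):

    # Stay current on patches
    sudo apt update && sudo apt full-upgrade
    # Brute-force protection plus automatic security updates
    sudo apt install fail2ban unattended-upgrades
    sudo dpkg-reconfigure -plow unattended-upgrades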

Tom C’s 10 Steps To Make Your Windows 10 Upgrade Succeed

For weeks, your Windows 7/8 system has probably been nagging you to upgrade to Windows 10, and perhaps you have been hesitant to take the plunge because of the media reports on how “buggy” Windows 10 is. This article will help you prepare for the upgrade in a manner that lowers risk and makes it a good experience.

It is a normal part of the operating system release life cycle for a major OS release to contain major bugs. Providing the ability to do an in-place upgrade of a previous version of the OS, without re-installing all the apps, increases the complexity of the rollout, simply because the previous OS may have been patched/hacked (by the OEM or the user) in ways that the new OS doesn’t know how to handle. To put it bluntly, upgrading a sick operating system often results in a sick operating system. In order to prevent issues during your upgrade, you need to do the following:

1) Perform due diligence that all your hardware devices are supported in Windows 10, with either Windows 10 drivers or Windows 7/8-specific drivers. Be very cautious of hardware that has only a driver targeting the Windows XP architecture, and consider what it means to you if that device doesn’t work in Windows 10.

2) You should not have to worry too much about applications from major vendors (Microsoft, Adobe, etc.), but applications written by individuals or indie publishers (who may not even be around anymore, with software last built in 2008) need to be considered a risk — what does it mean to you if those apps no longer work?

3) Make sure your storage hardware is supported in Windows 10! Yes, this technically falls under item 1, but you need to be sure that your storage is expected to work. If you have a motherboard that supports RAID and a bay full of drives configured under that RAID technology, and Windows 10 does not work with that RAID technology, you will face a huge issue: your disk will appear to be missing or empty, and Windows 10 might assume it is a blank disk and format it!

If your storage device is external and connected via USB, also perform due diligence that your computer or motherboard maker provides USB drivers that work in Windows 10, or be certain the computer’s USB ports don’t need a special driver. This is mostly a concern if your computer/motherboard supports both USB 2.0 and 3.0, where usually there is a proprietary driver involved, provided by the motherboard or chipset maker.

If your storage hardware is just an internal standard SATA drive, you shouldn’t have any issues.

4) Make sure you have enough disk space. The Windows 10 in-place upgrade process makes a backup of your Windows 7/8 configuration so that you can roll back if needed (you must roll back within 30 days). Obviously this requires disk space overhead. As a general rule of thumb, you should have 50GB free before considering an OS upgrade.

5) Uninstall programs that you don’t really use much, or that you can reinstall without too much hassle. Why rely on the in-place upgrade process to migrate apps that take 20 seconds to reinstall?

6) Ensure your existing OS is healthy. The details of this would be a separate long article, but suffice it to say that a sick Windows 7 or 8 might make your Windows 10 upgrade fail miserably without any way of rolling back.

7) Uninstall your antivirus. They always say to do that before installing any piece of software, and no one does, but an in-place upgrade that uplifts your entire OS isn’t something you want to risk having your antivirus intervene in, shutting down critical processes and failing the upgrade.

Also, I’ve seen cases of the antivirus not working very well after an in-place upgrade, where removing and re-installing it after the Windows 10 upgrade fixed that.

Obviously, you should also make sure your system is free of viruses and malware.

8) Back up your “My Documents” to a cloud service or backup device. The in-place upgrade should not harm your files, but why chance it?

9) Don’t forget to consider drivers from the OEM PC maker, especially on laptops, such as hotkey services or power management. If there is no Windows 10 version of your hotkey service, then once you are on Windows 10 the vendor-specific buttons on your laptop or keyboard may not work. Windows 10 should handle power management, but some devices, like netbooks that rely on aggressive power management, might not have the battery longevity that you had prior to upgrading.

10) Make sure your wireless network adapter is expected to work in Windows 10, and has a Windows 10 or Windows 7/8-specific driver. Yes, this also falls under item 1, but pay particular attention here, because if you upgrade and the Wi-Fi card does not work on boot, Windows 10 may not handle that very well.

I saw this on one system: on the first boot into Windows 10 it ran extremely slowly and was unusable. Once the Windows 10 drivers were installed it was fine. But achieving this was difficult because the system was so bogged down. It literally took 2 minutes for each click or keystroke to take effect, and the Start menu did not function at all, so it was hard to get to Control Panel or Device Manager to remove the drivers and then reinstall them.

Post-install: once you have done the above, invoked the install, and it has completed, your first order of business is opening Device Manager and making sure you have no yellow or red exclamation marks and all your devices are listed; then check out your system thoroughly.

If your system is not performing well, reboot it (properly), as I have seen it happen many times that the first session was very slow but a reboot fixed it.

Once you are convinced everything works, install your antivirus software and re-install the apps you removed in the process above. Invoke Windows Update and force it to look for updates, apply them, and reboot if needed.

If you find your Windows 10 does not work well (e.g. slow, crashing, etc.) and you can’t resolve the issue, then you are left with two options: reset Windows 10 or roll back to Windows 7 or 8.

Resetting Windows 10 basically blows away all the apps and the Windows 10 installation (which was upgraded from a previous version) and installs a clean Windows 10. However, if you choose this option, you lose the ability to roll back to your previous OS. Take this option only if you are confident you are going to remain on Windows 10. Otherwise, invoke the rollback and hope that works.

If you are unable to get Windows 10 working despite your best efforts, visit http://www.ubuntu.com

Review of Windows 10 Technical Preview


Windows 10 is out for technical review by anyone desiring to do so. Simply go to Microsoft’s site, look for the trial program, and sign up to get an ISO that will install Windows 10 with a trial license. I would strongly recommend not wiping your existing OS to evaluate this; instead, create a VirtualBox VM and install to that. VirtualBox is a product from Oracle which is free to use, and allows you to virtualize a PC so you can run another OS from within your current OS.

So far I can say it looks promising. The desktop is back to being the center of the OS, and the Start menu is back completely. It looks like Windows 10 is a nice shift back to focusing on the things you need in a desktop OS, instead of trying to change your desktop into a big tablet.

Microsoft is famous for following a pattern of “great operating system” followed by a flop, and then “great operating system”. Windows 8 is to Windows 7 what Vista was to Windows XP. Hopefully those that purchased Windows 8 will get a free or very low-cost upgrade.

Tom C