Free Computing for Solopreneurs? - 7. Where Linux Comes From
In the previous article I spoke about Free and Open Source Software (FOSS) and why it is gaining in popularity. In this article I’m going to provide a quick overview of the poster-child for FOSS: the Linux operating system, and why it was necessary for it to come into existence.
Why Were New Operating Systems Necessary at All?
In the 1970s and early 80s, the dominant operating system in research computing was Unix, developed at AT&T’s Bell Labs division. Because of an earlier antitrust settlement, AT&T was barred from selling software as a commercial product, so Unix was licensed cheaply, source code included, to anyone who asked. This was the primary reason it was so widespread, especially at universities and also in the private sector. It is also the reason why we have alternative operating systems available to us today.
In 1984, a crisis occurred in the freely distributable Unix world. AT&T was broken up by court order, and as a result it was no longer bound by the old antitrust restrictions. AT&T now asked that all Unix users stop sharing the source code and began selling usage rights for the software. This was not well received. While large corporations could afford to pass this cost on to consumers, programmers and researchers at universities as well as non-profit organizations had no one to pass the cost on to. More importantly, sharing the source code was part of the Unix culture, and it was in line with the idea that knowledge should not have a price tag.
Was Unix Knowledge or a Product?
One of the outcomes of this situation was that new software licenses were developed in the mid-80s to counter AT&T’s actions. This gave rise to several open licensing models and to the creation of new operating systems based on the Unix source code from before AT&T restricted it. These were primarily developed by programmers at UC Berkeley, so many have the BSD acronym, for Berkeley Software Distribution, in their names. The most popular modern versions of these are FreeBSD, OpenBSD, and NetBSD, and they are still actively used and maintained. As a matter of fact, they run some of the largest Internet systems today because of their stability, longevity, and adherence to core Unix principles.
The new open software licensing models also allowed programmers to develop Unix-derived operating systems specifically targeted at the education sector. One of the best known of these operating systems is called Minix, developed by computer scientist Andrew Tanenbaum at the Vrije Universiteit in Amsterdam. Unfortunately, because of restrictions in its original publisher’s license, it was not freely distributable until 2000.
Minix serves as an example of the checkered relationship between commercial and FOSS development. New lawsuits are brought regularly by companies claiming ownership over some small portion of code they say infringes their rights. As a result, the FOSS community has responded with ever more carefully vetted versions of software, free of previously licensed and newly claimed code.
How Did Linux Work Around this Issue?
It is under the shadow of these continuing legal challenges that Finnish software engineer Linus Torvalds developed a Unix-like kernel, released under an open license and completely free of proprietary code. A kernel is essentially the heart of an operating system, much like a main processor is the heart of a complete computer.
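If you are curious, you can see this kernel for yourself on any Linux machine. A quick sketch, assuming a standard shell with the common `uname` utility available:

```shell
# Ask the kernel to identify itself: name, release, and hardware type.
uname -s    # kernel name (prints "Linux" on a Linux system)
uname -r    # kernel release, for example something like "6.1.0-18-amd64"
uname -m    # machine hardware name, for example "x86_64"
```

Everything else you think of as "the operating system" — the desktop, the tools, the applications — sits on top of this one freely licensed core.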
Torvalds started work on this project in 1991 and it became hugely popular very quickly. By the mid-90s Linux was so well received that even supercomputers began using it instead of expensive commercial Unix alternatives.
While today only about 3.37% of desktops run Linux, that share has been growing quickly over the last year. Linux is also the dominant server OS at 35%, and it powers roughly 95% of the world’s top web and internet servers. It should not be overlooked that even 3% of desktops represents over 30 million users worldwide. That is still about a quarter of Apple’s desktop share, so it is not too far behind.
Why the Kernel is So Important
As mentioned, the kernel is essentially the heart of an operating system. Before the kernel was isolated as a completely open source core, ensuring that an operating system was free of commercially owned code was extremely difficult. As soon as a company claimed ownership over even a small portion of it, the entire system could be held up until a new court decision was reached. This process could take years.
Obviously, this made it very difficult for mission-critical systems to run any Unix-derived operating system consistently – at any point it could be stopped for legal reasons. If the computer in question was responsible for critical military infrastructure or life-saving monitoring in hospitals, these kinds of interruptions made the use of open source operating systems highly risky. Of course, the alternative was to go back to licensing a commercial version of Unix and pay for the service, something companies like AT&T were more than willing to promote.
By isolating the kernel as an open source base, Linux offered a realistic foundation on which to develop the rest of the operating system. Developing in distinct modules reduces risk: when a licensing claim is made against one specific part, the rest of the system can continue to function without disruption. Clearly, ensuring that the kernel, the heart of this system, was free from proprietary code was of the highest priority.
Where Does Linux Go From Here?
While all software is typically developed in modules, the innovation here was that the licensing itself follows the same modular approach. This creates not just a more stable operating system overall, but also allows free software to grow in ways that are truly revolutionary. The licensing model has now been in use long enough that I believe free software will eventually grow much faster than closed-source software. Actually, I think we are witnessing this growth right now.
Of course there are still many reasons to use proprietary software, the biggest one being that some applications just aren’t available as FOSS. For example, I use an Apple phone because, for my specific business needs, it fits better in my day-to-day workflow. While I could have switched to an Android phone, which runs an operating system based on Linux, that would have complicated my life just enough to make me less productive.
Because so many of the services I depend on are based on commercially produced software, I am not able to switch over at this time. That could change in the future, and I mention it because I firmly believe software development is constantly changing. As a business owner, I am always willing to consider alternatives if they will improve my workflow. It just isn’t the time for me yet.
Now, before I receive a bunch of hate mail about this decision, there is a silver lining here. Because FOSS is modular at its core, it is also highly adaptable. FOSS (and Linux) developers have always had to compete with commercial software, so they have built software that is fundamentally more interoperable with commercial applications and other operating systems.
This is one more reason why I am quite confident that the open software model, including its open licensing model, has important advantages over commercial, closed software. As such, it is poised to take significant market share away in the future. Will it eclipse commercial software? Who knows, but it will undoubtedly become a larger part of the market, and consequently, of people’s lives.
In the next post I will talk about which version of Linux is best suited for small businesses and entrepreneurs. Because there are literally hundreds of options, this may seem to be a difficult decision. In my experience, however, there are only a few you should consider for your business needs.