Wednesday, February 25, 2015
The unintentional conspiracy
So why am I typing this on a laptop running GNU/Linux, the free software operating system, not an Apple or Windows machine? And why are my phones and tablets running a privacy-enhanced offshoot of Android called Cyanogenmod, not Apple’s iOS or standard Android?
Because, first of all, I can get my work done fine. I can play games. I can surf endlessly. The platform alternatives have reached a stage where they’re capable of handling just about everything I need.
Control is moving back to the center, where powerful companies and governments are creating choke points. They are using those choke points to destroy our privacy, limit our freedom of expression, and lock down culture and commerce. Too often, we give them our permission—trading liberty for convenience—but a lot of this is being done without our knowledge, much less permission.
Via Reddit, Why I’m Saying Goodbye to Apple, Google and Microsoft — Backchannel — Medium
While I'm sympathetic to his views, I don't believe there's one vast conspiracy to restrict consumers' use of computers. Each step, taken in isolation, is understandable; it's only when the steps are viewed together over time that they can appear to be a vast conspiracy.
Take Apple's control over the applications that can run on its devices (the article limits itself to programs that run on the iPhone or iPad, but even on its general-purpose computer line, the Mac, Apple is slowly clamping down: on my work-issued Mac laptop I had to dive deep into the settings just to be able to run programs I wrote myself). Apple does this to enhance the user experience, and has done so going back as far as 1984 with the release of the first Macintosh. A major complaint at the time was the inability to modify the computer, as there were no expansion slots. But on the flip side, that made the Macintosh easier to use and more stable.
Sure, you could add new hardware to a PC running MS-DOS, but it wasn't particularly easy to do (mainly due to a hardware limitation in the original IBM PC, carried forward by other manufacturers to remain fully compatible, that forced users to learn about device addressing and IRQ settings), and there was always the chance that the device driver (a special program for talking to the hardware, and most likely not written by Microsoft) could have bugs that crashed the machine (at best; at worst, it could silently corrupt memory). By fully controlling the hardware and restricting upgrades, Apple could ensure a better user experience for The Rest Of Us™.
And now it's about more than just hardware. Computer viruses (and yes, "viruses" is the proper plural for more than one computer virus) and worms are nothing new (The Shockwave Rider, a science fiction novel first published in 1975, was partially about computer worms), but with the commercialization of the Internet in the mid 90s, the threat of malware has grown to such proportions (even images can be a vector for malicious computer code) that it makes sense to severely restrict what can run on a computer and what a program can do (hence my having to tweak settings on my own laptop to allow me to run my own software). And Apple, by restricting the software that is allowed to run on its equipment, can curate that software, again making for a better user experience For The Rest Of Us™.
There's also a myth that large companies like Apple and Microsoft are trying to prevent The Rest Of Us™ from programming our own computers. During the rise of the home computer in the 70s and 80s, pretty much every computer sold came with some form of programming environment, even if the language was as simple as BASIC. At the time, that was a selling point, primarily because there wasn't the large market for prewritten software that there is today. With the rise of shrinkwrapped software, there was less need to provide a programming environment with each computer.
And frankly, the people who buy computers and don't want to program vastly outnumber the people who do want to program (I attended high school in the mid 80s and took Pascal; I can probably count on one finger the number of people from that class who are still programming today. People, in general, don't want to program). There was pressure to lower the price of computers (there was a lot of competition in the 80s and 90s), and market research probably revealed that not many people cared about programming. And hey, if the customers don't care about BASIC, that's an easy thing to remove to lower the price. No, there's no vast conspiracy to keep people from programming, just a lack of interest.
I also run my own email server. I personally don't find it all that hard, but then again, I've been managing servers since the mid 90s, and by now I know the ins and outs of running an email server (harder these days than in the early 90s, long before spam and HTML-laden emails deluged the Internet). Not many people want to bother with that, and I'm including people who could, in principle, run their own email servers. It's easier to leave that to a company that specializes in running email servers.
In fact, it's easier to rent computers in a data center than to own your own machines there, leaving the actual hardware to companies that specialize in running it (and yes, I'm a person who rents a "computer" at a data center because it's cheaper and I don't have to bother with buying and managing the hardware, so even I am not completely immune to convenience). But I realize there's a risk in not having physical ownership of the hardware, and for now, I can live with that risk.
But a vast conspiracy? Nah. I just don't see it.