A.K.A. Tech Bro Tinfoil Hat Time#

It’s a new year, and I have decided to write something new. Well, not new, it’s pretty old, actually.

Over the years I have written about silly things like BitTorrent and Linux. But lots of people write about that. The only original thought that I have had, at least one relevant to a nerdy weblog, is a collection of old computing industry/information technology pet theories that I have heard over the years but have never seen anyone make a big deal about.

So here, in no particular order, is a collection of “Computing Conspiracy Theories” that I have considered over the years. I call them conspiracy theories because they fit the classic conspiracy criteria:

  1. Little evidence - Most of these theories are not based in verifiable fact, but more on unanswered questions.

  2. Usual Suspects - Most of these theories blame either the federal government or a big tech corporation like Google.

  3. Confirmation Bias - Most of these theories are based on my personal biases against the government or big tech organizations. Over the years I have come to find that big tech companies suck, and this stuff just proves it.

The Cray Line of Supercomputers was always subsidized by the NSA#

Supercomputers are like the fighter jets of computing. They’re unimaginably expensive, there aren’t that many of them, and a lot of nerds are completely obsessed with them.

The primary example of this is the Cray. It’s one of the rarest brands of computer, with one of the longest histories. Continuing with the military aircraft analogy, the Cray is like the B-52 of computers.

Military code breaking has been a prime motivator for advancing the field of computer science. As with pretty much everything in the 20th century, WWII had a huge impact on military and intelligence investment in computation. A lot of things came out of WWII, such as the National Security Agency.

The Cray business line has been around for 50 years. And like any storied line of computer systems, it has changed hands a few times over the years. Like Digital Equipment Corp, Cray has ended up as a subsidiary of Hewlett Packard Enterprise. If that many owners have gone out of business, or been purchased and stripped for parts, why has the product line been kept alive?

When so much of the supercomputing world has gone over to Beowulf clusters, how has the Cray managed to survive?

When you read the news about Cray systems, fewer than a dozen are said to exist at any one time. Yet HPE paid over a billion dollars to acquire the product line. Does that mean that each Cray system costs more than $100 million to manufacture? What do they then sell for?
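
For what it’s worth, here is the napkin math behind that question as a minimal sketch. The dollar figure is just the “over a billion” from above treated as an even $1 billion, and the fleet sizes are pure guesses, not anything from HPE’s or Cray’s books:

```python
# Napkin math only: divide the acquisition price by a guessed installed base
# to see what each system would have to be "worth". Both numbers are
# assumptions, not figures from HPE or Cray.

ACQUISITION_PRICE = 1_000_000_000  # "over a billion dollars", treated as an even $1B

for fleet_size in (10, 25, 100):
    implied_value = ACQUISITION_PRICE / fleet_size
    print(f"{fleet_size:>3} systems -> ~${implied_value / 1e6:,.0f}M per system")
```

The per-system number only stays eye-watering if the installed base really is that small, which is exactly what makes a hidden fleet tempting to imagine.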

Or, is there an equal or greater number of these supercomputers working for clandestine organizations? Could it be that the Intelligence community is pressuring private industry to keep Cray computers around?

Google, not Moore’s Law, motivated Intel’s development of the multi-core processor#

I don’t know how well it is known, but big data companies like Google and Facebook build their own servers rather than buying from a vendor like Dell or HP. These “whitebox” servers are built from components purchased directly from the manufacturers. In a data center, you will spend a large amount on servers, but even more on networking and cooling. All three of those systems require power distribution, battery, and generator support, which will run you at least as much again as the servers and networking. One of the great ways to cut costs in a data center is to cut the electricity bill.

When the Intel Pentium 4 came out, it was a banner year for power supply manufacturers. The P4 was the Humvee of CPUs: a single-core processor with a TDP of 84 watts, essentially meaning it used a lot of power and ran hot. The P4 is what gave rise to enormous heatsinks and CPU fans. Today, there are Intel processors like the N100 that can deliver 4 cores with a TDP of 6 watts.
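
To make the power argument concrete, here is a minimal back-of-envelope sketch. The fleet size, electricity price, and PUE below are made-up assumptions rather than anyone’s real numbers, and TDP is a crude stand-in for actual power draw:

```python
# Back-of-envelope sketch: yearly electricity cost of an 84 W part vs. a 6 W
# part at fleet scale. Fleet size, electricity price, and PUE are assumptions.

HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.10    # assumed industrial electricity rate, dollars per kWh
PUE = 1.5               # assumed overhead for cooling and power distribution
FLEET_SIZE = 1_000_000  # assumed number of CPUs in the fleet

def annual_power_cost(tdp_watts: float) -> float:
    """Yearly electricity cost for one CPU running flat out at its TDP."""
    kwh = tdp_watts / 1000 * HOURS_PER_YEAR
    return kwh * PUE * PRICE_PER_KWH

for name, tdp in [("Pentium 4 (84 W)", 84), ("N100-class (6 W)", 6)]:
    per_chip = annual_power_cost(tdp)
    print(f"{name}: ${per_chip:,.2f}/year per chip, "
          f"${per_chip * FLEET_SIZE:,.0f}/year across {FLEET_SIZE:,} chips")
```

With those made-up numbers, the hot part costs on the order of $100 per chip per year in electricity versus under $10 for the cool one, and that gap gets multiplied by every board in the building.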

So why did Intel suddenly alter course on the P4 and switch to the Intel Core processor design? Did they hit a theoretical physical limit on processor design? Why did Intel abandon its iconic Pentium design and start making [glorified laptop processors](https://en.wikipedia.org/wiki/Intel_Core#Product_lineup)?

Well, Google is rumored to own millions of servers, and is probably Intel’s biggest buyer of CPUs. If Google tells you that your shit is too wasteful and hot, you find a way to optimize it and cool it down. And if you need a quick jolt in power efficiency, you start with a laptop processor.

If you extrapolate that out, you can see the trend in tech toward parallelized technologies, or technologies that benefit from lots of simultaneous threads: virtual reality, Bitcoin, and large language models.

Microsoft’s .NET Framework is copied from Java#

This particular theory is based on things I heard while working with former Microsoft employees.

In the ’90s, Java was the coolest thing in technology. It was going to revolutionize everything (much like AI is today) and it seemed to get shoehorned into every blasted thing. It felt like most developers were using Java, and Microsoft wanted Windows to be the default platform for it.

Well, Sun Microsystems didn’t like that, and there was a lot of legal wrangling over the Java Virtual Machine. In the end, Microsoft did away with its Java Virtual Machine and developed something else called the Common Language Runtime. So what?

Well, this is where the rumors come into play. I heard from ex-Microsoft employees that the very early versions of the CLR were written by interns. They were guided by senior developers who had access to the Java source code, but the actual code was written by very junior devs. Also, while the original CLR was being built and tested, the test framework and all of the test tools were still in Java.

Do I have proof? No. Could the people I heard this from have been lying or exaggerating? Yes. Is it really that big of a deal that Microsoft reverse-engineered Java? I mean, it’s not like the Kennedy assassination (which I know for a fact was done by aliens!), but it was a pretty clear win for Microsoft’s embrace, extend, extinguish strategy.

The Windows 2000 native TCP/IP stack was Unix code#

Speaking of 20-plus-year-old stories of Microsoft copying things… A long time ago, in the late 1900s, networks in general were hard to get working, and connecting to the Internet was an ordeal. There was a whole industry built on crap to connect DOS and Windows PCs together via directory servers for local networks. If you used Windows, you needed a bunch of even less-well-put-together shit to connect to the Internet.

The primary technology for Windows networking was Microsoft’s proprietary implementation of NetBIOS. In the early ’90s, the Internet connected Unix systems through TCP/IP. Apple and Windows networks were not meant to connect to the Internet. You could dial up to Unix systems through a number of terminal and terminal-emulation programs, but your personal computer connecting directly to the Internet was not a thing that happened. This was the age of BBSes and “information services” like Tymnet and CompuServe. In the ’90s, it took a bunch of confusing network encapsulation to create a simulacrum of IP on Windows NT networks.

So when the world suddenly wanted access to the World Wide Web, operating systems needed to quickly implement native TCP/IP stacks. This was a major reason for Apple’s shift from the classic Mac OS to the current generation of macOS.

The Apple solution was to base the new OS on a form of BSD Unix. The Microsoft solution was to take Unix networking code and adapt it to Windows. Is it a scandal? Not really, but when you look at the SCO vs. Linux legal drama, it’s rather hypocritical.

Microsoft bankrolled the SCO Unix copyright drama#

This is a much simpler conspiracy. Unix isn’t actually a single operating system, but rather a bunch of research projects by universities and corporations. In the dark ages of computing, there were endless legal and engineering disputes, known as the Unix Wars.

In time, most variations of Unix died off, and all that really remained were BSD, Linux, and the systems from high-end hardware vendors like Sun Microsystems, Digital Equipment Corp, and IBM.

As the web grew from static pages to dynamic applications, tech companies like Google and Amazon found more and more use for open source software like Linux, Apache, MySQL, and PHP. Because a lot of this software was available for free, or for a very low cost, Microsoft was getting killed in the server market.

While Unix was dying off, the Unix name and the copyrights around it somehow managed to end up in the hands of the SCO Group. The SCO Group then sued pretty much everyone.

The SCO Group rose from the ashes of Caldera Systems, a company that got a lot of hype during the Linux IPO craze of 1999 and 2000. Caldera raised a bunch of money and bought a big-name brand called SCO UnixWare. In the end it couldn’t actually sell any product, so it tanked.

Then, the SCO Group emerged, and sued everyone. How did a company that was circling the drain get the funding to re-emerge? Possibly with an investment from private equity with ties to Microsoft?

During the legal drama, Microsoft’s “Get the Facts” marketing campaign warned businesses that using Linux could lead to legal troubles. The SCO Group was suing Red Hat customers like AutoZone and making a lot of noise about it.

Coincidence? The world may never know.

All this stuff is super old#

Yes. That is the point. Technologies, and the companies getting rich off of them, come and go. Operating systems and office suites are not hot commodities anymore. The advent of the smartphone has dwarfed the market for computers and manually installed operating systems. Linux is alive and well on the desktop… of junior high school Chromebooks.

Today, Twitter has been purchased by Elon Musk, who turned it into a right-wing political project. The US government wants to ban TikTok. The rise and fall of big tech companies is nothing new. Deciding to push other companies out of the market is nothing new either. OF COURSE Instagram employs child psychology so that teens become addicted to the app. The game is still the same, it’s just more fierce.