The Missing Primitive


There was a time when people simply shared code. In the 1960s and 70s, researchers at universities would mail one another whole programs, sent hand to hand, the way scholars once circulated manuscripts. AT&T had built Unix and licensed it to universities for next to nothing. Nobody thought of software as a product back then. It was more like an idea that happened to run on a machine.

And then, someone realized there was money in it. AT&T began selling Unix commercially. Lawsuits followed. Berkeley had taken Unix and added virtual memory, added networking — but the legal trouble cast a long shadow, and for years the work stalled. During that quiet window, Linus Torvalds sat down and wrote his own kernel from scratch. He later said that if Berkeley’s version had been freely available, he probably wouldn’t have bothered. One of the most important pieces of software in history exists because of a lawsuit.

But here is the remarkable part. While all of this was happening, a man named Richard Stallman had been patiently building the other pieces since 1983. A compiler. A shell. A build system. He and his collaborators had everything except the kernel — which was the one thing Torvalds had just written. When the kernel arrived, it fit into place, and suddenly there was a complete operating system. Nobody had planned it that way. People simply built what was missing, and the pieces fit together.

That, it turns out, was what really mattered. Not that the code was free, though it was. Not that anyone could read it, though they could. What mattered was that you could take a piece from here and a piece from there and make something new. Over time, package managers turned the world’s code into a vast parts catalog — decades of accumulated work, one import away. Each piece built on the last. The work compounded.

Now, if this story sounds familiar, it should.

For a long time, AI research was open in much the same way. Researchers published what they found, and others built on it. Frank Rosenblatt told the world exactly how his perceptron worked in 1958. Geoffrey Hinton’s backpropagation spread through the field in the 1980s. And in 2017, a team at Google published the transformer — under a title that almost smiled at you: “Attention Is All You Need.” Here, take it.

And then, once again, the models got good enough to be worth real money. OpenAI — with “open” right there in the name — built ChatGPT on top of Google’s published transformer. By 2023 its own chief scientist was saying that the openness had been a mistake. Google had given away the key idea, and someone else built the business. The door was swinging shut, just as it had a generation before.

But here the story takes a different turn. When AT&T closed the door on Unix, it didn’t really matter, because code comes in pieces. Torvalds could write a kernel. Stallman could build the tools around it. A teenager in a dorm room could write a web server. Each piece worked on its own and with the others, and together, over years, they overtook the proprietary systems. It wasn’t the openness of any single piece that mattered so much as the fact that they could all be snapped together.

Models are different. A model’s knowledge lives in its weights, spread through the whole network. You cannot point to the part that knows chemistry and lift it out. You cannot take it apart and recombine it the way you can with code. There is no function to copy, no module to share, no import.

We do have some tricks. You can distill a large model into a smaller one, training the small model to mimic the large one’s outputs. You can fine-tune a model, carrying what it has learned into a new task. These are useful things. But they are not the same. They do not let a thousand people, working separately, each add a small piece that clicks into a larger whole and grows over time.
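To see why distillation is a copy, not a composition, it helps to look at the mechanism: the small model is trained to match the large model’s softened output distribution, a whole-distribution imitation rather than a transferable part. A minimal sketch of that loss, in pure Python with hypothetical toy logits (no real model involved):

```python
import math

def softmax(logits, temperature=1.0):
    # Scale logits by temperature; a higher T softens the distribution,
    # exposing more of the teacher's "dark knowledge" about near-misses.
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL divergence from the teacher's softened distribution to the
    # student's: training drives this toward zero, making the student
    # imitate the teacher's entire output behavior at once.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical logits for a three-class toy problem.
teacher = [4.0, 1.0, 0.5]
student = [3.0, 1.5, 0.2]
loss = distillation_loss(teacher, student)  # positive while they disagree
```

The point the sketch makes concrete: the loss is defined over the whole output distribution. There is no term in it that corresponds to “the chemistry part” — nothing you could detach and hand to someone else.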

That is what I think of as the missing primitive. Open source gave us a world where one person could write a function, another could wrap it in a package, and over decades the whole thing grew into something no one could have built alone. We do not yet have that for AI. Some models are open. But the knowledge inside them does not snap together. It does not compound. And I wonder — what would it look like if it did?