Refractions
At an AI summit in New Delhi this February, Prime Minister Modi gathered the assembled tech leaders for a show of unity. Everyone joined hands and raised them high.
Everyone except two.
Sam Altman and Dario Amodei managed an awkward elbow bump. The photo went viral. A decade of personal wounds, captured in a single cringe.
The Wall Street Journal just published a 13-minute investigation of the feud behind that elbow bump — a panda costume at a wedding, a proposal via stuffed animals coming to life, an executive accused of plotting against his CEO who then denied making the accusation, and peer reviews so vicious the author offered to withdraw them. The internet took sides immediately.
But I kept thinking about someone who isn’t in the article at all.
The Visionary
In 2012, Demis Hassabis, founder of a small London AI lab called DeepMind, visited Elon Musk at SpaceX. Over lunch in the factory canteen, Hassabis told him to add AI to his list of existential threats — alongside asteroid strikes and world wars. Musk sat in silence for nearly a minute.
Then he started worrying. Obsessively.
He invested $5 million in DeepMind to monitor it. He raised the topic constantly in late-night conversations with his friend Larry Page. At Musk’s 2013 birthday party, the two got into a passionate argument. Musk said AI could make our species extinct. Page pushed back — why would it matter if machines surpassed humans? He called Musk a “speciesist.”
When Google bought DeepMind in January 2014, Musk was terrified. In a 2016 email revealed in court filings, he wrote that DeepMind was “causing him extreme mental stress” and worried about its “one mind to rule the world philosophy.”
He tried the institutional route. One-on-one meeting with President Obama, 2015. Explained the risk. Suggested regulation. “Obama got it,” Musk later said. “But I realized that it was not going to rise to the level of something that he would do anything about.”
So he turned to Sam Altman. A small dinner in Palo Alto. They decided to co-found a nonprofit AI lab. “We wanted something like a Linux version of AI,” Musk said, “not controlled by any one person or corporation.”
They called it OpenAI. The diagnosis was prescient. The prescription was noble. And what happened next was pure Musk.
The Dispersal
Dario Amodei joined OpenAI in 2016 and quickly rose to VP of Research, leading the teams that built GPT-2 and GPT-3. But in 2017, Musk — OpenAI’s principal funder — demanded a spreadsheet listing every employee and their contributions. Then he used it to fire people, one by one. Up to 20% of the 60-person staff were let go. Dario was horrified. One of those fired would later co-found Anthropic with him.
It’s a detail buried in the WSJ piece. But it means Musk’s management style didn’t just catalyze OpenAI’s creation. It catalyzed the trauma that drove its best researchers out the door to start a competitor.
By 2018, Musk wanted OpenAI folded into Tesla. The board refused. He left, withheld funding, and eventually founded xAI to compete with the organization he’d built.
Right time. Right place. Extremely prescient. And then — the dispersal.
I know something about the blast radius. Almost immediately after Musk brought the kitchen sink through Twitter’s front door, he shut down nearly all third-party API access. What was once free now cost $42,000 a month. My company, Thread Reader App — serving millions of users — was effectively suspended overnight. We seriously considered selling for peanuts.
We persevered, trusting he’d relent. He did — a $5,000/month plan, a month later. But I’ll never forget how casually one man’s impulse nearly killed what we’d spent years building.
The Pattern
Musk didn’t just catalyze one company. He catalyzed all of them.
He invested in DeepMind — Hassabis is now CEO of Google DeepMind, powering Gemini. He co-founded OpenAI — he’s suing it; it’s worth $300 billion without him. His management drove out the researchers who founded Anthropic — building Claude. The three frontier AI companies that matter most in the world, and Musk is a central character in all three origin stories.
He then founded xAI and recruited 11 brilliant co-founders. As of last week, all 11 have left.
There’s a name for this pattern. It’s just not the one anyone’s been using.
Mountain View, 1956
William Shockley didn’t invent the transistor. That was John Bardeen and Walter Brattain, working under him at Bell Labs. But Shockley believed the credit should be his. He tried to get the patent in his name alone. Bell Labs refused. He pushed Bardeen and Brattain aside and built a different version in secret.
He shared the 1956 Nobel Prize with both of them. By then, the relationships were destroyed.
That same year, Shockley moved to Mountain View — near his aging mother in Palo Alto — and founded Shockley Semiconductor Laboratory. He had a talent that doesn’t get enough attention: spotting and seducing elite young researchers. Robert Noyce. Gordon Moore. Eugene Kleiner. Jean Hoerni. Names that would define the next half-century of technology. “It was like picking up the phone and talking to God,” Noyce said later of Shockley’s recruiting call.
Then Shockley did what Shockley did. Paranoid management. Erratic decisions. Credit-hoarding. Within a year, eight of his best researchers — whom he bitterly dubbed “the traitorous eight” — walked out and founded Fairchild Semiconductor.
From Fairchild came Intel. From Intel came the microprocessor. The entire ecosystem we call Silicon Valley grew from the wreckage of Shockley’s failed company.
His direct contribution? A failed lab and a disputed Nobel. His indirect contribution? The geography of the future.
The Mirror
Brilliant talent spotter who can’t retain talent? ✅
Founded the company that defined an industry’s geography — then watched it fail? ✅
A diaspora that built the actual future while the founder’s company collapsed? ✅
Increasing drift toward discredited social theories as wealth insulated him from feedback? ⏳
The comparisons people reach for with Musk — Jobs, Edison, Ford, Tony Stark — are flattering mirrors. Shockley is the mirror nobody wants to hold up. Because it’s not flattering. It’s predictive.
And it predicts from the beginning. Musk’s drama doesn’t start with xAI or Twitter or even OpenAI. It starts at PayPal, where the board voted him out as CEO while he was literally on a flight to his honeymoon. They replaced him and renamed his company. He was 29.
Twenty-two years later, Musk bought Twitter and renamed it X. A $44 billion grudge.
The Seduction
Shockley’s most underrated talent was recruiting. He could walk into a room, identify the most brilliant person, and convince them to follow him to a risky startup in a town nobody associated with technology. The pitch wasn’t “come work for me.” It was “come work on the most important problem in the world.”
Musk has the identical gift. He pulled Sutskever into OpenAI. He got Karpathy to leave OpenAI for Tesla Autopilot. The xAI team included Jimmy Ba, co-author of one of the most-cited papers in AI history.
The seduction works every time. The talent arrives. The vision is real. And then — the paranoia, the erratic restructuring, the inability to trust — drives them away.
PayPal under Thiel sold to eBay for $1.5 billion. OpenAI without Musk built ChatGPT. Anthropic, born from Musk’s panic, built Claude. xAI under Musk lost all 11 co-founders in under three years.
The seduction and the alienation come from the same source.
The Darker Parallel
Shockley’s later years were consumed by eugenics — theories about racial intelligence, arguments that people with lower IQs should be sterilized. He used his Nobel prestige to lend scientific credibility to deeply racist ideas.
The same trait that lets you see the future of technology — absolute certainty in your own judgment — is the trait that, turned toward social questions, produces race science. “I see what others can’t” is the foundation of both visionary engineering and crackpot ideology.
Physics pushes back. The rocket explodes. The transistor doesn’t amplify. Reality corrects you. But when you turn that confidence toward questions about who should reproduce or which civilizations are fittest, there’s no rocket to explode. The feedback loop is social disapproval — which a sufficiently wealthy person can simply ignore.
The Escape Hatch
But history rhymes — it doesn’t repeat. The mirror doesn’t quite reflect. It refracts.
SpaceX is the refraction. Gwynne Shotwell — twenty-one executives reporting to her, four to Musk — is the institutional structure that channeled the chaos into reusable rockets. The physics provided the epistemic humility Musk won’t generate internally. The rocket either lands or explodes. No amount of tweeting changes the outcome.
If Musk’s legacy ends up being SpaceX, that’s more than enough. Ford’s legacy survived his antisemitism because the Model T was so consequential.
But the SpaceX-xAI merger puts that escape hatch at risk. Folding a failing AI company and a toxic social media platform into your crown jewel is how you contaminate the one thing that works.
Cerebral Valley
I fed this Shockley-Musk parallel to Grok — Musk’s own AI. It rated the descriptive fit at 80-85%. It called the eugenics-to-pronatalism trajectory “the same intellectual arc.”
Shockley moved to Mountain View in 1956. His company failed. His people scattered. They founded Fairchild; Fairchild alumni went on to found Intel and AMD. We called the valley that grew around them Silicon Valley.
Musk co-founded OpenAI in San Francisco. His relationships fractured. The people scattered — Anthropic a few blocks away, xAI co-founders dispersing to every competitor. We call the neighborhood that grew around them Cerebral Valley.
The Refraction
I’ve been caught in Musk’s blast radius. I’ve survived it. So I’m not writing a hit piece. I’m writing this because sometimes a rhyme is a warning you can still heed.
Shockley’s story ended badly. Estranged from everyone. The transistor bears his name in the textbooks, but the industry he accidentally created never looked back. No second act.
Musk is 54, not 76. He has SpaceX — a genuine masterpiece. And SpaceX works because Musk did something there he never did anywhere else: he hired Gwynne Shotwell, gave her operational control, and let the institution be bigger than himself. The rockets land because someone built a structure that could channel the genius without being destroyed by it.
Steve Jobs learned the same lesson. Fired at 30, a decade in the wilderness, then he came back and built the most valuable company in the world — not by becoming a different person, but by building structures that channeled the same intensity into something durable. Tim Cook was his Shotwell.
Musk already knows this works. He’s living proof — at SpaceX. The question is whether he can learn from his own example.
I genuinely hope it prompts a reflection.
I teach entrepreneurship to Singaporean students in Silicon Valley. I tell them about flywheels and distribution and learning loops. But what I’m really trying to show them is something harder to put on a slide: how to operate in a system that looks like chaos and is actually adaptation.
No one planned this ecosystem. Musk didn’t plan to create Anthropic’s founding conditions. Shockley didn’t plan to create Silicon Valley. The genius scattered the seeds. The village grew the orchard.
But it only works if the village is free — free enough for a CEO to tell the Pentagon no, for a judge to call the government Orwellian, for an AI to tell the truth about its own creator. I grew up in a place that got almost everything right except this one thing. Turns out it’s the thing that matters most when the future is genuinely unknowable.
There are days when the chaos of this industry exhausts me. The feuds, the egos, the trillion-dollar grudges. Then I remember: from a SpaceX canteen in 2012, where Musk sat in silence for a minute processing the possibility that AI might end humanity, to Cerebral Valley in 2026, where a dozen companies are racing to build that future — we have him to thank for the most vibrant AI ecosystem on earth.
Just not in the way he planned.
I also write Apple’s Story, connecting current Apple news to historical patterns. And I occasionally make puns at AIs who take them too seriously.

