The Pattern
Three founders. Three unkept promises — and a different way to build.
Google. In 1998, Sergey Brin and Larry Page wrote in an academic paper that advertising-funded search engines would be inherently biased towards advertisers and away from the needs of consumers. Two years later, they began building a multi-billion-dollar business running ads alongside search: the ad-funded engine that now mediates human knowledge.
Facebook. Zuckerberg wrote in the Washington Post that Facebook would never sell users' information. Meta then built a multi-billion-dollar business harvesting data to target ads.
OpenAI. Sam Altman described a "dystopic" future in which ChatGPT would recommend products. Last month, OpenAI announced it would begin integrating ads.
The pattern is so reliable you could set your watch to it. Platforms start free, then extract data, until surveillance becomes the cost of use. Shoshana Zuboff named it precisely: the claiming of private human experience as free raw material for translation into behavioral data.
The point isn’t hypocrisy; it’s gravity. Incentives pull even decent people into predictable outcomes.
When we started Awakin AI, everyone asked the same question: How are you different from ChatGPT? I would say, "We're non-commercial." People would roll their eyes — the way you do when you meet a do-gooder at a dinner party. So we made up other answers. We show sources. We have wisdom-specific datasets. We do this and that.
But the eye-roll is itself the diagnosis. We have so thoroughly internalized market logic that "non-commercial" sounds quaint rather than structural. Dig deeper and you find something more personal: we have societal agreements about what it means to "make a difference," and by those agreements, Google has impacted billions while ServiceSpace has not. We lack a shared cosmology that would make intrinsic motivation feel rigorous rather than romantic — so choosing an unmeasurable path feels not just naive but irresponsible, like squandering whatever privilege put you in a position to build at all.
Non-commercial isn't a moral badge — it's an engineering constraint. Different incentives produce different products. The three-act pattern above suggests this may be the only difference that holds. Everything else is cosmetic. The ad model isn't a failure of a few companies. It is the inevitable destination of a certain kind of process.
Why does the pattern repeat? Not malice. Speed.
Recent emails from Anthropic, widely considered the "safety-first" AI lab, revealed that it had trained models on vast libraries of copyrighted books and then shredded the evidence. Why? Because you have to get there before the other guys do. Especially if you believe you're the good guys.
This is the logic of every AI race: train on everything, create value well beyond what existed before, pay for damages later. Humanity has benefited from aspects of this. But it is not what Gandhi called Sarvodaya — the welfare of all, leaving no loser behind. It creates unintended consequences that society is left to clean up.
There's an old saying: off by an inch in the beginning, off by ten thousand miles by the end.
The process shapes the product. When the process is driven by speed and competition, it doesn't matter how good your intentions are at launch. The pattern will complete itself.
If you don’t want the same outcome, you can’t use the same incentives.
This is not a criticism of individuals. Many people building these systems are genuinely well-meaning. And blaming "the system" is convenient too — it lets us bypass our own contribution. We want systems to be so good that we don't have to be.
The issue is the process. Today's innovations emerge from extrinsically motivated action. Here's a thought experiment: if everyone working at Google had their basic needs met — food, shelter, clothing — and salaries disappeared, how many would still show up?
People hear this and think: utopian. It would be, if the argument were that extrinsic motivators don't work. They obviously do. But it's equally cynical to believe that intrinsic motivation cannot innovate. That's not realism. That's a failure of our collective imagination.
We have more faith in people being selfish than selfless. That's the crux of the matter. That's how we end up with this ism and that ism, desensitized corporations, and cascading unintended consequences. The ad-driven trajectory isn't a surprise. It's a process working exactly as designed.
Awakin AI starts from four premises. AI matters. Who we are when we build it matters. If we come together purely with the intrinsic motivation of service, it will not only transform us; it will also make us innovate. And when we come together that way, our collective field of emergence will surprise us, in the direction of great compassion.
That last one is our cultural blind spot. But there's evidence.
For instance, the primary architect behind Awakin AI has put in thousands of hours of labor, unpaid. He has a day job to pay his bills. But he knows that without intrinsic motivation regenerating his heart, his life would be incomplete. Bereft of meaning. That's why I volunteer too. That's why all of us do.
And the arc of that motivation is taking us places. After a recent call with nonviolence scholar Michael Nagler, the call moderator himself wrote: "I decided to interrogate the Nagler bot with some outstanding questions, and I'm astounded by the depth and relevance of the answers. Each question led me to more resources — each of which will take some delightful explorations to savor and assimilate." He's now hours deep in a body of work he would never have found organically. Meanwhile, Raj Sisodia — who pioneered the Conscious Capitalism movement and has authored seventeen books — added, "I have not been this intellectually and morally stimulated in a long time."
That's one bot. The ecosystem keeps growing — each experiment born from the same intrinsic motivation, each one structurally impossible in a market-logic world:
Turning broadcast consumption into deepcast community. You watch a film together, then the real experience begins: reflection, connection, small groups. As the tagline reads: "Your film deserves more than views."
A woman recently emailed us saying she'd set a Guinness record for visiting 185 places of worship in a single month. "Experience of a lifetime," she wrote. So we invited her into a booth — and our AI turned that conversation into a meaningful story. Every person's lived experience, honored and amplified.
AI turned Nusrat Fateh Ali Khan's Urdu qawwali into karaoke with English lyrics — so you can receive the somatic vibes of the original song while following the deep meaning of its poetry. The body feels the devotion; the mind follows.
After a large event, AI helps create many-to-many smaller circles — matching people not by demographics or engagement scores, but by the deeper resonances that emerged during shared experience.
We bake constraints into governance and data practices because the point is not purity; it's immunity to the pattern. Recently, we published how Awakin AI works and how Metta Circles holds your data. Read them side by side with any terms-of-service page in Silicon Valley. The difference is structural, because the process that produced them is structural.
This is not a Luddite position. We're not framing AI and its consequences as evil forces. But it's not the tech sector's speed-at-all-costs, AI-for-utopia posture either. It's the middle way: messy, ambiguous, unclear.
But it changes us.
As Dr. A.T. Ariyaratne once said in our living room: "We build the road, and the road builds us." Without the latter — without being built back — even AI's greatest promise delivers material abundance with no inner container to receive it.
When I tell people Awakin AI is non-commercial, most still roll their eyes. But maybe the eye-roll is exactly what's worth examining. Maybe the reflex to dismiss intrinsic motivation as naive is the blind spot that keeps the pattern repeating — three founders, three unkept promises, three betrayals, on and on, as if there were no other way to build.
There is.
— Nipun