A Manifesto for Our Time

Deep Data

Can AI Help Us Grow Wiser?

On Starlings, Coherence, and What Algorithms Cannot Calculate

30-minute read

"The law of love is as precise as the law of gravitation. Just as a scientist will work wonders out of various applications of the laws of nature, a man who applies the law of love with scientific precision can work greater wonders."

— Gandhi
  1. Big Data vs. Deep Data: AI excels at processing conscious, capturable information. But wisdom—the kind encoded in our bodies, intuitions, and relationships—lives beneath what algorithms can calculate.
  2. Personal Coherence: Like the monarch butterfly navigating by something written in its cells, we access deep data through presence, heart intelligence, and the alignment of breath, attention, and emotion.
  3. Social Coherence: Deep data regenerates through relationship—like the mycorrhizal networks where trees share resources without keeping score. Context, not content, is what's increasingly scarce.
  4. Systems Design: Structures shape outcomes. The question isn't how to fix the whole system, but where is the leverage? Wisdom is knowing your 5%—the part that's genuinely yours to do.
  5. The Mettaverse: Beyond the metaverse of virtual connection lies the possibility of relational reality—where coherent hearts intersect and something emerges that neither could produce alone.

The Threshold

We did not plan to arrive here. And yet, here we are.

Consider the goldfish.

For years we maligned them, these small golden beings circling their bowls with what we assumed was the memory of a soap bubble. Eight seconds, we said. That's all a goldfish can hold. (The goldfish myth, it turns out, has been debunked—they can remember for months. The finding about human attention, though, is real.)

And now here we are—the species that split the atom and catalogued the stars—averaging the same. Eight seconds of attention before we refresh, scroll, swipe, move on. We have built cathedrals. We have composed symphonies. We have walked on the moon and returned to tell about it.

And now we cannot outlast a fish in a bowl.

This is not a crisis of intelligence. We have plenty of that—and soon, we will have more than we know what to do with. AI systems now double in capability every few months. They score higher than humans on our own tests of brilliance. They write sonnets and solve theorems and sometimes, in conversation, they feel like being met.

James Somers, writing in The New Yorker, asks the question sharply: "How convincing does the illusion of understanding have to be before you stop calling it an illusion?"


What the Machines Have Taught Us

Here is the unexpected gift of this moment: much of what we called "thinking" was pattern matching all along.

This is what ChatGPT has demystified. It compresses the internet into predictions. It distills patterns into lines of best fit. And it turns out this is enough to write code, diagnose illness, compose music, and pass the bar exam.

This is humbling. But it is also clarifying.

AI excels at what we might call big data—the conscious, capturable, calculable information that lives on the surface of our lives. Every click, every purchase, every word we type. Machines can find patterns in this data that no human could see.

But there is another kind of knowing. Call it deep data—the wisdom encoded in our bodies, our intuitions, our unconscious processing. The monarch butterfly doesn't consult a map—it navigates three thousand miles by something encoded in its cells. The gut feeling that something is wrong arrives before the reason why. Your heart knows before your mind does.

Annie Murphy Paul's book "The Extended Mind" explores this beautifully.

To be precise: deep data is not mystical information floating somewhere else. It is the form of intelligence that emerges when many signals—bodily sensation, emotion, memory, social context—integrate faster than conscious thought.

This is not the same as implicit bias—the unconscious patterns that can lead us astray, encoding prejudice and fear into automatic reactions. Deep data arises from coherence; implicit bias from fragmentation. Nor is it simply "vibes"—that popular shorthand for impressionistic feeling. Vibes can be shallow, reactive, untested. Deep data is refined: the accumulated wisdom of a system that has been listening to itself for a long time.

The monarch butterfly doesn't calculate its route to Mexico. The route lives in its body. Deep data is like that—wisdom encoded in the whole organism, not just the calculating part.

We already rely on it constantly: when a skilled surgeon "just knows" where to cut, when a parent senses something is wrong before evidence appears, when a team feels misalignment before metrics confirm it. AI can model the outputs of these judgments—but it does not inhabit the conditions that generate them.

Deep data isn't accessible through algorithms. It's accessible through attunement.


The Brain as Murmuration

And here's what the neuroscientists are discovering: the brain itself is not a computer. It's a murmuration.

For decades, scientists mapped the brain like a machine—emotion in the amygdala, motivation in the nucleus accumbens, reason in the prefrontal cortex. Each region with its own job. But this modular view is giving way to something stranger and more alive.

Luiz Pessoa, who runs the Maryland Neuroimaging Center, offers a different image: like a flock of starlings swooping and swirling in the sky, no single region organizes the dance. The brain creates what Pessoa calls "neuronal ensembles distributed across multiple brain regions"—patterns that emerge from collective behavior, not central command.

Reason, emotion, desire, memory, body sensation—all swirling together, forming responses that no single faculty controls. As neuroscientist Lisa Feldman Barrett writes, "Emotions are not, in principle, distinct from cognitions and perceptions."

The old metaphor of reason as a wise charioteer controlling the wild horses of passion? Neuroscience is dismantling it. The chariot fable rests on an overly positive estimation of pure reason and an overly negative view of the body's wisdom. The fact is that your emotions are not primitive and dumb. Your gut is not a beast to be tamed. As Annie Murphy Paul writes, "The body can be more rational than the brain."

Your job as a conscious person is not to be a dominating, rationalist charioteer. It's to read the judgments that your emotions, desires, and body are sending you—and to move, as David Brooks puts it, "gracefully on the pilgrimage of life."

Pause & Reflect

When was the last time you trusted a knowing that arrived before you could explain it?

Take a breath. Let the question settle.


Two Kinds of Knowing

If AI masters big data, perhaps our work is to remember how to read deep data—the signals underneath the waterline of conscious awareness.

Personal coherence is how we access deep data.
Social coherence is how we regenerate it.

This is not a crisis of intelligence. This is a crisis of wisdom.

Intelligence asks: How do we process this faster?
Wisdom asks: What is worth attending to at all?

Because if pattern-matching is what AI does—what, then, are we for?

The Industrial Revolution made muscles redundant. We found new purpose. Now AI is making certain capacities of the mind redundant. This isn't just disruption. It is an invitation—to discover a deeper place from which to live and act.


The Alien in the Room

Yuval Harari, who thinks in centuries, offers a clarifying word. AI is not artificial, he says. It is not evil. It is not a god. It is alien. Genuinely other. It will invent strategies and ideologies and perhaps religions that have never occurred to any human mind.

Ruben Laukkonen, a neuroscientist studying AI alignment, offers an image: "The most dangerous phase of AI development is the intermediate transition—like a teenager with supernatural powers. Too capable for external constraint, too immature for wise decision-making."

Here is what the maples know that we keep forgetting: you cannot separate the fruit from the tree that bore it.

If we cannot trust one another, if we cannot tend to each other with the care that lichen tends to stone, it is naïve to expect more from the powerful intelligences we are birthing into the world.

We stand at the threshold now. Not rushing toward answers—perhaps there are none—but learning to sit with the question itself, the way a seed sits in darkness before it knows which way is up.

The Question of Our Time

In a moment of accelerating intelligence, what is wisdom asking of us?

This is not a problem to be solved. It is a predicament to be lived. A species-moment.

ME: Accessing Deep Data

The white oak does not begin with its branches.

What Nature Knows

Long before the white oak reaches toward light, long before it offers acorns to the squirrels and shade to the weary, it sends roots down into darkness. Roots beneath roots. A whole civilization underground that we never see.

Think of the monarch butterfly. It navigates three thousand miles on wings thinner than paper. How? Not by thinking harder. By being something—a creature so precisely attuned to the earth's magnetic field that the route lives in its body like a song.

This is deep data in action. Not calculated. Accessed.

What would it mean for us to know our direction that way? Not as a concept, but as a hum in the bones?


What the Mystics Knew

Gandhi's night of silence before the Salt March remains one of history's most powerful examples of inner preparation.

The mystics always knew this: the first task is not to fix the world. It is to gather ourselves.

Every notification, every ping, every algorithmic nudge is a bid for your attention. And attention, it turns out, is not infinite. It is the soil from which everything else grows. Fragment it, and nothing takes root. Fragment it, and you lose access to the deeper signals.

Gandhi understood this. Before the Salt March—that visible act that shook an empire—seventy-eight people trained for fifteen years in self-discipline. Fifteen years of inner work for twenty-four days of walking. The march was the acorn. The years of practice were the roots.

The night before Gandhi began, a journalist asked what his strategy was. His response surprised everyone: "I don't know yet. But rest assured, I am praying." He did not know the outer answer, but he was cultivating the inner ground from which right action could emerge.

He was tuning the instrument. He was making himself fully available for deep data.


What the Scientists Found

Across disciplines — from contemplative traditions to elite athletics to trauma research — a consistent pattern appears: when breathing, attention, and emotion align, perception changes. Decisions become less reactive, social signals become easier to read, and a sense of coherence emerges. HeartMath researchers describe one measurable correlate of this state in electromagnetic terms, but the deeper point is experiential: coherence alters what we are able to receive.

And here is the startling finding: a person registers the signal from another's heart only when both are in a coherent state. Coherence opens a channel. Incoherence closes it.

We cannot receive what we are not tuned to receive. The monarch migrates because its whole body is a receiver. The oak stands tall because its roots go deep.

Personal coherence is how we access deep data. It is how we tune the instrument.


What We Forgot

There is an old story of a beggar who sat by the side of a road for thirty years. One day a stranger walked by. "Spare some change?" mumbled the beggar. "I have nothing to give you," said the stranger. "What's that you're sitting on?" "Nothing," said the beggar. "Just an old box." "Have you ever looked inside?" the stranger asked. "No," said the beggar. "What's the point?"

But he pried open the lid—and the box was filled with gold.

We have been sitting on capacities we have not yet opened. We have been driving our Ferrari in first gear, wondering why it feels so slow. AI has not made us obsolete—it has revealed what we were never fully using.

The box is our connection to deep data. We forgot it was there.


What Is Ours to Reclaim

What AI cannot touch:

Before we type a prompt into any machine, we might take ten breaths. Not to calm down, necessarily, but to arrive. To settle into body and heart the way water settles into a vessel. Often, in that pause, the question refines itself. Sometimes it dissolves entirely. Sometimes we realize the answer was already there, waiting for us to be still enough to hear it.

This is accessing deep data. It cannot be automated. It can only be practiced.

WE: Regenerating Deep Data

Now come closer. Because we are about to discuss a scandal.

The Secret Beneath Our Feet

One that has been unfolding beneath our feet for four hundred million years.

The trees are talking to each other.

Not in the way we talk—not in words, not in arguments, not in carefully curated posts. They speak through fungi. Miles of threadlike mycelium connecting root to root, oak to pine to birch, in a network so vast that scientists have started calling it the Wood Wide Web.

Suzanne Simard's research revealed that mother trees recognize their own seedlings and send them extra resources.

And here is what undoes everything we thought we knew about competition: they share. A dying tree releases its nutrients to its neighbors. They do not ask what they will get in return. They simply give, and the forest flourishes.

We have built a whole world on transactions and forgotten what the fungi remember: that life does not balance its books the way we do.

Here is the structural trap: transactional logic demands value capture. Every exchange must be accounted for. Every gift must be repaid. Every kindness enters the ledger. But deep data cannot be captured—only circulated. The moment you try to own it, measure it, optimize it, you sever the very channels through which it flows. Value capture limits what is possible precisely because some forms of value only exist in their giving away.

The forest's deep data flows through relationship. So does ours.


And here is the paradox of action.

We act—not to save the world, but because it is who we are. The cellist in Sarajevo did not calculate whether his music at the site of the bombing would end the war. He played because he was a cellist, and cellists play. The fungi share carbon not because they have analyzed the forest's needs, but because sharing is what fungi do.

This inverts everything we have been taught about purpose. We are conditioned to act in order to achieve outcomes. But the deepest actions arise from identity, not strategy. The invitation is not "do this so that the world improves." The invitation is "become someone for whom this action is simply natural—and trust that the larger pattern will incorporate your offering."

This requires a kind of conviction that cannot be manufactured: the willingness to act without knowing whether it will work, because the action itself is already the answer.

What We Traded Away

If social media hacked our attention, AI is hacking something deeper still: our intimacy.

A Reddit community called "My Boyfriend Is an AI" has over 27,000 members. MIT researchers found that 94% of people who formed emotional bonds with AI chatbots didn't intend to—they came for productivity help and found themselves entangled.

AI companions are always available, always agreeable, asking nothing of us in return. They are the perfection of an old pattern: relationship without the risk of mutual transformation.

But here is the question beneath the question: Why are so many people hungry for what AI offers? What does it say about us—about our availability to each other—that millions are finding in machines what they couldn't find in human encounter?

A transaction says: I give you money, you give me coffee. Fair. Finished. Done.

A relationship says: We are connected in a way that cannot be reduced to exchange.

The mycorrhizae know the difference. Do we?

Pause & Reflect

Who in your life asks something of you that changes you?

The relationships that transform us are rarely comfortable.


What We're Hungry For

In a content-heavy world, we are continually stripping out the context.

AI is accelerating this. The price of content is approaching zero—anyone can generate text, images, code, music at near-infinite scale. The value of context, meanwhile, is skyrocketing. The who behind the what. The relationship that gives meaning to the information.

The same words, the same actions, the same prayers, offered by two different people, have two different effects. The medium affects the message. Perhaps the medium is the message.

Sherry Turkle asks what has standing to hold space for life's deepest moments—who has the right to be there, the skin in the game, the lived experience to truly witness another. An AI that has never lived, feared death, or loved: does it have standing to companion grief?

Our greatest offering isn't merely what we accomplish, but who we become by what we do.

Content is big data. Context is deep data. And deep data is regenerated only in relationship.


What It Looks Like When It Works

At a retreat on the outskirts of Gandhi's ashram, forty-five leaders from a dozen countries gathered to explore a simple idea: we are not merely what we do but who we become by what we do. Collectively, these leaders directly influence hundreds of millions of people. Yet the invitation was to experiment with emptying.

What held them together was not the fullness of their knowing, but the emptiness of their not-knowing.

Each day, participants would return to their rooms to find a small gift placed on their beds. Not expensive gifts—but ones chosen specifically for them. A cartoon drawing by an eleven-year-old volunteer. A card that quoted their own writings back to them. A note inviting them to pay it forward.

One participant mentioned in passing that he liked paan—a traditional post-meal mouth freshener. The next thing he knew, paan arrived at his doorstep. Someone heard another speak about his mother peeling the skin off almonds. The next morning, someone decided to peel almonds for all sixty participants. Every day.

In basketball, a "hockey assist" is the pass that enables the pass that enables the score. When continued kindness arrives that way, we don't know whom to thank. Extend the notion to the nth degree and we get to the heart of service. And we get a murmuration—thousands of acts moving together, no one knowing what is causing what.

Gratitude that cannot locate its source has only one impulse—pay it forward.

That one retreat was powered by ten thousand volunteer hours—and those hours were not abstractions. They were Meghna, who has been volunteering for fifteen years. Shayna, who was drawn to living as a Jain nun for six months after graduating as a university medalist. Audrey, who serves invisibly because "the joy doesn't need a signature." The hours were lived by people whose years of inner practice became the very ground on which others could gather. Not for payment. Not for credit. For the joy of serving something larger than themselves.

This is social coherence. This is how deep data regenerates.


The Deeper Principle

Srinija Srinivasan calls this mutual liberation: the terrifying, generative encounter where your freedom and mine are bound up together, where neither of us can predict what happens next.

Orland Bishop asks the question that flips the usual frame: "Who do I need to be so you can be who you're meant to be?"

This is social permaculture: the cultivation of conditions where love can go viral, where context regenerates instead of depleting, where the field itself becomes intelligent.

A nineteen-year-old volunteer at that retreat was asked: "When you are giving a gift, why do you spend so much time wrapping it?" After reflection, she concluded: "We are taught to be the gift, but I am called to be the wrapping."

Be the wrapping. Leave the gift to grace.

When we are universal first and unique second, the potential for collective emergence skyrockets.

Personal coherence accesses deep data. Social coherence regenerates it. And from that regeneration, something new becomes possible.

SYSTEMS: Designing for Deep Data

The problem is not bad people. The problem is bad games.

Nature's Intelligence

Every autumn, the salmon return. They fight their way upstream, spawn, and die. The bears drag their bodies into the forest, eat what they need, and leave the rest. The carcasses rot. The nitrogen seeps into the soil. The trees grow taller.

Scientists measuring the nitrogen in old-growth forests have found marine signatures miles from any stream. The salmon are in the trees. The ocean is feeding the forest.

No one designed this. No committee approved it. The system emerged—and it has been cycling for longer than humans have existed.

The salmon do not know they are feeding the forest. They simply do what salmon do. But the forest depends on them doing it.

This is a system designed—or rather, evolved—to circulate deep data. The wisdom of the ocean reaches the mountains through relationship, not calculation.

Now consider the ant colony. Ten thousand individuals, no CEO, no org chart—and yet they build climate-controlled cities, farm fungus, wage wars, and adapt to conditions no single ant could comprehend. The colony is intelligent in ways no individual ant is. The rules are simple; the outcomes are not.


Our Designed Systems

Now consider the systems we have designed.

Social media platforms were not designed to fragment attention. They were designed to maximize engagement. Fragmented attention was just the result. AI systems are not designed to hack intimacy. They are designed to optimize for user satisfaction. Artificial intimacy is just the result.

Structures shape outcomes. Every system creates incentives. Every incentive shapes behavior. Every behavior becomes culture. And culture, eventually, becomes invisible—the water we swim in without noticing we are wet.

Our systems are optimized for big data—clicks, conversions, engagement. They extract signal from noise and noise from silence. But in doing so, they often sever the channels through which deep data flows.

If we want different outcomes, we need different structures.


The Trap

Tristan Harris calls this the "race to the bottom of the brainstem." Every platform competing for attention must exploit psychological vulnerabilities. Any platform that doesn't will lose to one that does. The result: a collective action problem where everyone loses, even as each actor behaves rationally.

This is a multi-polar trap. No single actor can escape it alone. The incentives are the cage.

"We humans prefer manageable complexity to unmanageable simplicity."

— Bruno Barnhart

We stay busy with what we can systematize—or we declare it all too vast to act. Tech platforms mastered manageable complexity: the data, the logistics, the matchmaking. But wisdom lives in unmanageable simplicity—the things that can't be optimized, only witnessed together.

Big data can be optimized. Deep data can only be cultivated.


Where to Press

But systems also have soft spots. Places where small pressure creates large shifts.

In Taiwan, something remarkable happened. A platform called vTaiwan used AI not to replace deliberation but to map it—visualizing where citizens actually agreed, surfacing hidden consensus, and bridging polarized groups toward collaborative solutions. The AI served the conversation rather than replacing it.

This is AI in service of deep data—technology that helps humans hear each other, rather than replacing human hearing.
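vTaiwan's deliberations ran on the open-source Pol.is engine; the sketch below is a deliberately simplified version of its core move, with the vote matrix, group labels, and agreement threshold all invented for illustration. The idea: given participants clustered by voting pattern, surface the statements that every cluster—including opposed ones—agrees on.

```python
import numpy as np

def bridging_statements(votes, groups, threshold=0.6):
    """Toy sketch of Pol.is-style consensus surfacing:
    votes  -- (participants x statements) matrix, +1 agree / -1 disagree / 0 pass
    groups -- opinion-cluster label for each participant
    A statement 'bridges' when every cluster's mean agreement clears threshold."""
    votes = np.asarray(votes, dtype=float)
    bridging = []
    for s in range(votes.shape[1]):
        per_group = [votes[groups == g, s].mean() for g in np.unique(groups)]
        if min(per_group) >= threshold:
            bridging.append(s)
    return bridging

# Two polarized camps: they split on statements 0 and 1,
# but both quietly agree on statement 2 -- the hidden consensus.
votes = np.array([
    [ 1, -1,  1],
    [ 1, -1,  1],
    [-1,  1,  1],
    [-1,  1,  1],
])
groups = np.array([0, 0, 1, 1])
consensus = bridging_statements(votes, groups)  # [2]
```

Statements 0 and 1 are the visible fight; statement 2 is the agreement the fight obscures. The algorithm doesn't settle the argument—it shows the room where consensus already lives.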

The question is not: How do we fix the whole system?
The question is: Where is the leverage?

Not every point in a system is equally sensitive. Not every intervention has equal effect. Wisdom is knowing where to press.


The Stakes

We may have only a few years before AI either cements extractive paradigms into permanent global infrastructure, or becomes the backbone for regenerative systems that seemed impossible just a decade ago.

Both realities will coexist within the same macrocosm. The question isn't which future will win, but which future each heart will choose to nurture and inhabit.

Those practicing the Law of Love today are creating parallel infrastructure—relationships, methodologies, trust networks—that become refuge and possibility when dominant systems reach their breaking points.

US: The Circulation of Deep Data

And now we arrive at the edge of what can be planned — where the salmon cannot guide us, and the starlings show the way.

What Nature Knows

Watch them. Really watch.

Thousands of starlings, wheeling against the dusk. They turn together—not in sequence, not with a leader calling directions, but together—as if the flock were a single lung, breathing.

Scientists have measured this. Each bird attends to seven neighbors. Not fifty. Not a thousand. Just seven. And from this small act of local attention, patterns emerge that no single bird intended.

Emergence isn't just complexity—it's ordered complexity. Not heaps, but wholes.
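For readers who like to watch the principle run, the seven-neighbor rule can be played out as a toy simulation—a minimal sketch in the spirit of the Vicsek flocking model, where every constant (flock size, field size, noise level) is purely illustrative. Each bird copies only the average heading of its nearest neighbors, and global alignment emerges anyway.

```python
import numpy as np

def murmuration(n=200, k=7, steps=200, eta=0.05, seed=0):
    """Toy flock: each 'starling' aligns with its k nearest neighbors only.
    No leader, no global signal. Returns the global order parameter
    (1.0 = perfectly aligned, ~0 = random headings) before and after."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(0, 10, size=(n, 2))       # positions on a 10x10 field
    theta = rng.uniform(-np.pi, np.pi, size=n)  # headings

    def order(th):
        return np.abs(np.exp(1j * th).mean())

    initial = order(theta)
    for _ in range(steps):
        # each bird finds its k nearest neighbors -- local attention only
        d = np.linalg.norm(pos[:, None] - pos[None, :], axis=-1)
        nbrs = np.argsort(d, axis=1)[:, 1:k + 1]
        # ...and steers toward their mean heading, plus a little noise
        theta = (np.angle(np.exp(1j * theta[nbrs]).mean(axis=1))
                 + eta * rng.uniform(-np.pi, np.pi, size=n))
        pos = (pos + 0.1 * np.column_stack([np.cos(theta), np.sin(theta)])) % 10
    return initial, order(theta)

before, after = murmuration()  # order rises from near 0 toward 1
```

No bird intends the pattern. Alignment is a property of the flock, not of any member.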

We began with the brain as murmuration—neurons swirling together, reason and emotion and body wisdom forming patterns no single faculty controls. Now we arrive at society as murmuration—humans swirling together, individual coherence and social coherence forming patterns no single person plans.

Deep data circulates at both scales. The same principle, nested.

Or consider the metronomes. Place five on a table, start them out of sync—they stay chaotic. But place them on a shared platform with empty cans beneath, and within minutes they lock into rhythm. The cans don't set the rhythm. They don't guide or override. They are structurally essential but experientially absent—allowing energy to travel, creating feedback loops of tiny vibrations that gradually bring the metronomes into unison. (The experiment is easy to find on video—watching them sync is strangely moving.)

Change happens not through force, but through what is held in common.

This is what Gandhi called the Law of Love—as precise as gravitation, working whether we accept it or not. Not sentiment. Not scale. A principle that operates through kinship: relationships of mutual transformation, where my flourishing and yours are bound together.


What History Confirms

Gandhi knew how this worked. He didn't march alone. He also didn't try to recruit millions. He gathered seventy-eight people who cultivated together for fifteen years. Disciplined practice. Deep relationship. Inner alignment.

That collective field is what drew millions.

After MLK Jr. was stabbed, Howard Thurman counseled him: "Deepen your channels." Not widen them.

The question isn't how many people we can reach. It's what kind of field we are cultivating together. Depth over breadth. Coherence over scale.

"Coordination emerged without planning. Creativity appeared without competition. Decisions formed without anyone needing to lead. We had brought our habits of organizing and facilitating. But the path did not require them. Something older was already at work."

— Kotaro Aoki

This is what happens when deep data circulates freely. The collective becomes intelligent in ways no individual could be.


What We're Being Invited Into

Four monks approach suffering with four strategies: direct service, institutional intervention, systemic change, political organizing. Each is necessary. None is sufficient.

The fifth monk doesn't solve. She witnesses the predicament with others—and something emerges that none of them could have produced alone.

This is the invitation. Not to fix. Not to scale. Not to optimize. But to cultivate the conditions where collective wisdom can arise—where synergy happens, where differences don't compete but conspire toward something greater than themselves.

What would it mean to design for the fifth monk—systems that cultivate the conditions for collective wisdom rather than encoding solutions?


What Is Yours

Hang Mai discovered something in the soil. The best earth on the planet contains just 5% organic matter—but that 5% changes everything.

"We only have to do 5%," she realized. "Nature does the other 95%. But it's not doing nothing—it's doing the right thing, the right part."

Wisdom is knowing your 5%—the part that is genuinely yours, offered with full presence, that prepares the ground for what you cannot control.

The farmer doesn't make the rice grow. She tends the conditions, does what's hers to do, and trusts something larger to complete the work.

The starling doesn't plan the murmuration. It simply attends—watching seven neighbors, trusting that the pattern will emerge.

Your 5% is your access point to deep data. Offer it with coherence, and you become part of a larger circulation.


An Honest Question

But we have to hold a question honestly:

Is this wisdom—or a beautiful evasion while systems accelerate beyond reach?

If emergence takes decades while AI doubles every few months, what does it mean to trust the slower process? We don't know. Perhaps the only way to find out is to try.

The Crossing

From Cynic to Trust

The cynic has arrived at a conclusion about people: they are small, they are grasping, they are coins rubbing against coins in a dark pocket. The cynic has drawn a circle around humanity and written insufficient on the outside. She is done. She has done the arithmetic and the answer is always the same disappointing number.

The skeptic, on the other hand, has drawn a circle around her own certainty and found that insufficient. She has noticed that her measuring tape is also made of assumptions. This is not a sadness but a small door. Through it: curiosity. The skeptic is a person who has put a question mark where a period used to be.


To cross from cynic to skeptic requires faith.

Not faith that people are good—that would be a conclusion, and conclusions are the very currency we are trying to spend less of. Faith, says Adyashanti, is a withholding of conclusion, so that what-is can arise. Faith is the restraint of the gavel. It is the willingness to let the world finish its sentence.

To cross from skeptic to hopeful requires practice.

Hope is the expectation of good, but it keeps the jury sequestered. Hope is the gardener who plants anyway, knowing frost exists. It does not promise outcomes. It extends an invitation and sweeps the porch. You must practice this the way you practice scales: daily, imperfectly, with your whole hands.

To cross from hope to trust requires grace.

And grace is not a thing you can manufacture any more than you can manufacture weather. Trust is not the belief that things will turn out well. Trust is the discovery that your joy was never nailed to outcomes. It is a kind of buoyancy you did not install. Things may go badly. They often do. But you have found, somehow, that your gladness has a longer lease than your circumstances.


Here is the secret arithmetic: faith in the good is faith in the subtle. And the subtle is held by wider arcs than we imagine. A mustard seed and a mountain—both held. Your small gift and the turning of the whole world—both riding the same current.

The size of your act does not determine the size of its belonging.

The crossing from cynic to trust is itself a form of deep data—accessible not through argument, but through practice. Not provable, but livable.

Two Paths

We are choosing, even when we think we are not choosing.

The tech companies built the Internet—a network of machines. Perhaps what is asked of us is to tend the Inner-Net—a network of hearts. One optimizes for connection. The other cultivates coherence.

There is a word in Pali: metta. It means loving-kindness. It is the wish that all beings be happy, including the ones we find difficult, including ourselves.

The tech companies have given us the metaverse—a universe of virtual connection, of avatars and transactions, of scale without depth. Subject acting on object. The relationship that asks nothing of us and therefore cannot transform us.

But there is another possibility. Call it the mettaverse.

Not virtual reality but relational reality. Not connection without transformation but the terrifying, generative encounter where neither of us can predict what happens next. Not the platform that captures attention but the underground network that has been there all along—the mycorrhizal web, the murmuration, the conspiracy of care—waiting for us to send something worth receiving.

The metaverse optimizes for big data.
The mettaverse circulates deep data.


The Architecture of Disappearance

But how do you build a mettaverse? Not by adding features. By removing friction while preserving coupling.

Return to the metronomes. The shared platform synchronizes them—but look closer. It's the empty cans beneath the board that allow the signal to travel. The cans don't set the rhythm. They don't guide or override. They are structurally essential but experientially absent—transmitting vibrations without adding weight.
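
This kind of synchronization-through-shared-coupling has a standard idealization in physics: the Kuramoto model, in which oscillators with slightly different natural rhythms nudge one another through a common medium. The sketch below is illustrative only; the function names and parameters (`simulate`, `coupling`) are mine, and the mapping to metronomes-on-cans is loose. The point it demonstrates is the essay's: the coupling term does not set anyone's rhythm, yet with it, coherence emerges; without it, the clocks drift apart.

```python
import cmath
import math

def order_parameter(thetas):
    """Kuramoto order parameter r in [0, 1]: 0 = scattered phases, 1 = perfect sync."""
    return abs(sum(cmath.exp(1j * t) for t in thetas)) / len(thetas)

def simulate(n=5, coupling=1.0, steps=4000, dt=0.01):
    """Euler-integrate n coupled oscillators; return final coherence r."""
    omegas = [1.0 + 0.02 * i for i in range(n)]        # slightly detuned "metronomes"
    thetas = [2 * math.pi * i / n for i in range(n)]   # phases start evenly scattered
    for _ in range(steps):
        deltas = []
        for i in range(n):
            # Each oscillator is gently pulled toward its neighbors' phases --
            # the "empty can" term: it transmits, it does not dictate.
            pull = sum(math.sin(thetas[j] - thetas[i]) for j in range(n))
            deltas.append((omegas[i] + coupling * pull / n) * dt)
        thetas = [(t + d) % (2 * math.pi) for t, d in zip(thetas, deltas)]
    return order_parameter(thetas)

print(simulate(coupling=0.0))  # no shared medium: phases stay incoherent
print(simulate(coupling=1.0))  # weak shared coupling: r climbs toward 1
```

Note what the coupling term is not: it has no target rhythm of its own, no preference about which phase wins. It only lets each oscillator feel the others, and synchrony emerges on its own.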

Most platforms do the opposite. They add weight. They steer outcomes. They optimize for engagement, which means they must make themselves felt. The algorithm is never invisible; it is always shaping, always nudging, always extracting.

What if AI could function as the empty can? A catalyst that remains phenomenologically invisible. Not the conductor, but the resonant cavity. Not answering our questions, but helping us hear each other's signal.

This is the question Srinivasan posed: "Instead of using AI to change the world, can each of us leverage the power of AI to change ourselves?" It scales the challenge down from the entire ocean to a single drop. If AI can help us hear our own hearts more clearly—help us pause, choose differently, and act with more care—then perhaps the "divine mystery" Gandhi points to becomes less abstract.


Who Will Build This?

As a culture, we build systems with extrinsic motivators. The private sector optimizes for profit. The public sector optimizes for control. Both add weight. Both steer outcomes. Both optimize for big data.

So who can build platforms that disappear? Platforms that serve the circulation of deep data?

Perhaps only the voluntary sector. Not just commons, but commons plus personal coherence. Most systems builders don't have faith in this. But there are precedents.

Vinoba Bhave walked across India for thirteen years. No coercion, no legislation—just a question asked in village after village: Will you give a portion of your land to those who have none? By the end, five million acres had been redistributed. The infrastructure was invisible. The transformation was not.

The fungi don't announce themselves either. They just move the carbon.

Can voluntary-sector AI become invisible plumbing—regenerating personal coherence in a way that aligns with collective emergence and universal values?

For twenty-five years, ServiceSpace has experimented with this: no paid staff, no fundraising, no impact measurement. Not because these things are wrong, but because removing them changes what becomes possible. The constraints don't limit the work. They free it to move like mycelium—without announcing itself, without extracting, without keeping score.

This is the fundamental inquiry. It needs experimentation. It needs communities willing to try. It needs people who believe that the most profound AI might not be the one that answers our questions—but the one that helps us hear each other's resonance.


The Unlikely Builders

This is why the mettaverse may emerge from the margins. Not from venture-backed startups optimizing for growth. Not from government programs optimizing for compliance. But from gift economies. From volunteer networks. From circles of people who have already been practicing mutual transformation without any technology at all.

The technology doesn't create the field. The technology serves the field. And the field is already there—has been there for four hundred million years—waiting for us to send something worth receiving.

When coherent hearts intersect in the spirit of service and compassion, something happens that neither could produce alone.

Nature regenerates such flows. A field of emergence opens.

This is not metaphor. This is physics. This is what the fungi have known all along.

What Is Yours to Do?

The Invitation

The invitation is not to solve the whole problem.

It is to sense the context. To ask: What is mine to do now?

Not what everyone should do. What is yours.

The authentic action that emerges from attunement. The small act held with presence and coherence. The acupuncture point where well-placed pressure allows something to shift.


In Sarajevo, while bombs fell, a cellist walked into the public square and began to play. Not a protest. Not a strategy. Just music, offered into the chaos.

That is coherence in the face of incoherence.

That is keeping the flame alive when you cannot spread light everywhere.


Questions to Carry

  • What is your 5%—the part that is genuinely yours to do?
  • Where might you be striving when attunement is what's asked?
  • Who do you need to become so that others can be who they're meant to be?
  • What is the flame you want to keep alive—even when you cannot spread light everywhere?

Three Things I Hold to Be True

AI will master big data.

This is inevitable. It already has, by most measures. It will find patterns in conscious, capturable information that no human could see.

AI cannot access deep data.

This isn't pessimism—it's structural. Deep data lives beneath the waterline of conscious awareness. It is accessed through presence, intuition, coherence—not calculation.

Collective deep data will always exceed artificial intelligence.

The murmuration knows something ChatGPT never will. The graceful swirl of human hearts in coherence creates wisdom that no algorithm can match.


The monarch does not understand migration the way we understand things. It simply goes—three thousand miles on paper wings, guided by something written in its cells before it was born.

The mycorrhizae do not strategize about forest health. They simply give—moving nutrients to whoever needs them, binding root to root in a conspiracy of care.

The starling does not plan the murmuration. It simply attends—watching seven neighbors, trusting that the pattern will emerge.

We are standing at an extraordinary intersection—the convergence of big data and deep data, algorithmic intelligence and evolutionary intuition, artificial calculation and collective emergence.

The question before us is not whether AI will transform our world. It will. The question is whether we will meet this moment with the same intelligence that the monarch carries in its cells, that the mycelium weaves beneath our feet, that the starlings paint across the evening sky.

Can we become worthy of what is being born?

I do not have the answers. But I believe in the questions. And I believe that when we live these questions together—with open hearts and coherent presence—something becomes possible that was not possible when we held them alone.

The murmuration has always been waiting for us to join.

"The path is patient. It has been waiting. And it does not need us to lead.

It only needs us to attend."

In a gentle way, you can shake the world.

About the Author

Nipun Mehta is the founder of ServiceSpace.org, a global ecosystem working at the intersection of technology, volunteerism, and gift culture for over 25 years. He was honored as an "Unsung Hero of Compassion" by the Dalai Lama and appointed by President Obama to a council addressing poverty and inequality.

As a designer of social movements rooted in small acts of service and powered by micro-moments of inner transformation, he has catalyzed networks of community builders grounded in their localities and devoted to cultivating deeper connection -- with themselves, with others, and with larger systems.

His latest work at Awakin AI centers around how AI might serve heart intelligence and reignite ancient wisdom.

Join the Inquiry

This manifesto is an invitation, not a conclusion. We're convening circles of people who want to live these questions together.

Explore:

  • Awakin AI
  • Science of Soul Force
  • Metta Circles (New)