A Machine Trained with Love: What does reciprocal AI look like?

To reimagine a slow and hopeful model of intelligence could mean asking AI to move like a swallow, adapt like slime mould or sense like a root system. Artist and activist Beccy McCray weaves together collective memory, environmental rhythms, and her own lineage to ask how machine learning might support - not replace - intuitive, interspecies ways of knowing.

Beccy McCray

What if artificial intelligence wasn’t designed for optimisation, efficiency or control - but for reciprocity, embedded knowledge, and collective movement?

This question emerged for me in the making of Intuition Maps - a participatory artwork grounded in the instincts of humans and more-than-human species. Rooted in Northamptonshire, and commissioned by Fermynwoods Contemporary Art, the artwork brought together local communities and native migratory species - chiffchaffs, swallows, painted lady butterflies, migrant hawker dragonflies, and more - to explore how capitalism and climate collapse are reshaping the intuitive rhythms that guide us. Together, we created more than maps: they were divinatory tools - drawings, dialogues, and speculative routes that traced instinct and imagination.

I found myself turning to artificial intelligence - not for answers, but as a kind of collaborator. I started to see AI not merely as a tool of control, but as an unpredictable and uncanny model. A new kind of crystal ball. A machine trained on signs and patterns, learning to ‘read’ the world. Increasingly, I believe AI is our latest act of divination - playing into our ancient urge to foresee, foretell, and make meaning out of uncertainty.

For me, this is deeply personal. My Romany heritage includes a cultural connection to fortune telling that stretches back generations - not as a novelty, but as a serious practice of intuition and survival. I increasingly work intuitively, using art and ritual to tune into the world and read its signs. And I see parallels here with AI - particularly machine learning, a process that searches for signals, finds patterns, and generates possibilities. At its best, it’s a collaboration between the known and the unknown.

So I began experimenting - first by prompting large models like ChatGPT, then by building a custom GPT for Intuition Maps, feeding it local climate data, migratory routes, and responses from participants in my workshops. What emerged was poetic, strange, and surprisingly insightful. The AI began to offer speculative maps of migration, connecting species, stories, and seasons in new ways. I trained the machine with love - not just to perform tasks, but to listen. To become a participant in a wider ecology of intelligence.
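In practice, a minimal sketch of that kind of exchange - written here with the OpenAI Python library rather than ChatGPT’s custom GPT builder, and with filenames, prompt and model name that are purely illustrative assumptions rather than a record of the actual project - might look like this:

```python
from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY is set in the environment

# Hypothetical local material gathered for the project - these filenames are placeholders.
climate_notes = open("northamptonshire_climate_notes.txt").read()
migration_routes = open("swallow_and_chiffchaff_routes.txt").read()
workshop_voices = open("participant_responses.txt").read()

response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {
            "role": "system",
            "content": "You are a divinatory map-maker. Read climate notes, migratory "
                       "routes and community voices, and offer speculative, poetic "
                       "routes - intuition maps, not forecasts.",
        },
        {
            "role": "user",
            "content": f"Climate notes:\n{climate_notes}\n\n"
                       f"Migratory routes:\n{migration_routes}\n\n"
                       f"Voices from the workshops:\n{workshop_voices}\n\n"
                       "Draw us an intuition map for the coming season.",
        },
    ],
)
print(response.choices[0].message.content)
```

The point is less the code than the framing: the same model can be asked to optimise, or it can be asked to listen.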

But of course, this work unfolds inside a contradiction. The AI tools I used - OpenAI’s GPT-4 and others - are part of vast infrastructures shaped by venture capital and centralised control. These tools are energy-hungry, increasingly opaque, and often entangled with extractive industries. Many leading AI companies are actively selling optimisation tools to fossil fuel corporations. So when I ask, ‘What if AI was about reciprocity?’, I’m also asking: how do we reimagine these systems from the inside out?

This is where I’ve found hope in the idea of an Indigenised Internet, proposed by Jake Advincula, in kinship with the work of James Bridle. It invites us to reimagine digital space as a decolonised, living system - one grounded in care, storytelling, decentralisation, and interdependence. A space that acknowledges the interconnectedness of everything. Within this vision, AI could function less like a controlling intelligence and more like a member of a distributed network. A mycelial participant in a shared, regenerative logic.

What if we trained our machines to emulate not human cognition, but more-than-human wisdom? There’s already real-world research pointing us in this direction. One example I often return to is the way scientists have used slime mould (Physarum polycephalum) to model complex systems. Despite being a single-celled organism, it can solve mazes, find optimal routes between points, and adapt to changing environments. Researchers in Japan famously used slime mould to recreate the Tokyo rail network, and transport planning has since been modelled with slime mould algorithms, suggesting that these biological systems offer highly efficient, resilient solutions - without central control. It thinks by being in relation.
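The feedback loop behind those slime mould algorithms is surprisingly small: tubes that carry more flow thicken, tubes that carry little wither, and the shortest route emerges without any central planner. The sketch below is a toy version of that loop, in the spirit of the Physarum solver described by Tero and colleagues; the four-node network, constants, and update rule are simplified assumptions rather than the published model.

```python
import numpy as np

# Edges: (node_i, node_j, length). Two candidate routes from node 0 to node 3:
# a short one (0-1-3, total length 2) and a long one (0-2-3, total length 4).
edges = [(0, 1, 1.0), (1, 3, 1.0), (0, 2, 2.0), (2, 3, 2.0)]
n_nodes, source, sink = 4, 0, 3

conductivity = np.ones(len(edges))   # the "thickness" of each tube
dt, steps, inflow = 0.1, 200, 1.0

for _ in range(steps):
    # Kirchhoff's law: solve for node pressures given the current conductivities.
    A, b = np.zeros((n_nodes, n_nodes)), np.zeros(n_nodes)
    for (i, j, length), D in zip(edges, conductivity):
        g = D / length
        A[i, i] += g; A[j, j] += g
        A[i, j] -= g; A[j, i] -= g
    b[source] = inflow
    A[sink, :] = 0.0; A[sink, sink] = 1.0; b[sink] = 0.0   # pin the sink pressure to zero
    pressure = np.linalg.solve(A, b)

    # Flux through each tube; well-used tubes are reinforced, idle ones decay.
    flux = np.array([D / length * (pressure[i] - pressure[j])
                     for (i, j, length), D in zip(edges, conductivity)])
    conductivity += dt * (np.abs(flux) - conductivity)

for (i, j, _), D in zip(edges, conductivity):
    print(f"edge {i}-{j}: conductivity {D:.2f}")
# The short route (0-1-3) ends up thick; the long route (0-2-3) withers away.
```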

In another example, researchers at MIT and the Santa Fe Institute are developing swarm intelligence models based on the coordinated movements of birds, fish, and insects. These decentralised systems - like starlings murmurating across a winter sky - show how group decision-making can emerge without leaders, using simple rules and constant feedback. AI researchers draw on these systems to design algorithms that don’t just optimise for outcomes, but adapt, respond, and move together.
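Those ‘simple rules’ really are simple. In the classic boids formulation - a standard textbook sketch rather than any specific lab’s model, with arbitrary illustrative parameters - each bird feels only three local urges: drift toward its neighbours, match their heading, and avoid crowding them.

```python
import numpy as np

rng = np.random.default_rng(0)
n_birds = 50
pos = rng.uniform(0, 100, (n_birds, 2))   # positions in a 100 x 100 field
vel = rng.uniform(-1, 1, (n_birds, 2))    # initial headings

def step(pos, vel, radius=10.0, max_speed=2.0):
    new_vel = vel.copy()
    for i in range(n_birds):
        # Each bird senses only its local neighbours - no leader, no global view.
        dist = np.linalg.norm(pos - pos[i], axis=1)
        near = (dist > 0) & (dist < radius)
        if not near.any():
            continue
        cohesion = pos[near].mean(axis=0) - pos[i]        # drift toward the local centre
        alignment = vel[near].mean(axis=0) - vel[i]       # match the neighbours' heading
        separation = (pos[i] - pos[near]).sum(axis=0)     # don't crowd one another
        new_vel[i] += 0.01 * cohesion + 0.05 * alignment + 0.002 * separation
        speed = np.linalg.norm(new_vel[i])
        if speed > max_speed:                             # cap the speed
            new_vel[i] *= max_speed / speed
    return (pos + new_vel) % 100, new_vel                 # wrap around the field edges

for _ in range(500):
    pos, vel = step(pos, vel)
```

Run long enough, the flock begins to cohere and turn together - something like murmuration, with no conductor anywhere.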

We also see research emerging in bio-inspired robotics. Engineers are building systems that rely less on top-down control and more on distributed sensing. These machines are often more robust, more flexible, and more capable of navigating unpredictable environments. Could we imagine a neural network modelled not on the human brain, but on the root system of a tree?
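One concrete flavour of that distributed sensing is gossip averaging: nodes that only ever talk to their neighbours still converge on a shared reading of the world, with no coordinator. The sketch below is a toy illustration with invented sensor values, not drawn from any particular robotic or botanical system.

```python
import random

# Hypothetical sensor nodes - imagine moisture readings along a root system -
# each holding only its own value and a list of neighbours it can reach.
readings = {"root_a": 14.2, "root_b": 13.8, "root_c": 15.1, "root_d": 14.6}
neighbours = {
    "root_a": ["root_b", "root_c"],
    "root_b": ["root_a", "root_d"],
    "root_c": ["root_a", "root_d"],
    "root_d": ["root_b", "root_c"],
}

random.seed(1)
for _ in range(200):
    # One round of gossip: a random node averages its value with a random neighbour.
    node = random.choice(list(readings))
    peer = random.choice(neighbours[node])
    shared = (readings[node] + readings[peer]) / 2
    readings[node] = readings[peer] = shared

print(readings)   # every node settles on the collective estimate, with no central controller
```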

Then there’s the field of environmental prediction and modelling, where AI is already being used to simulate and anticipate changes in ecosystems. From butterfly migrations to the flowering times of wild plants, these systems try to make sense of climate-driven shifts. But too often, these models are used to manage risk from a human-centred perspective. What if they were used to foster interspecies solidarity instead? To guide policy that prioritises shared futures?
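Many of those flowering-time models reduce to something as humble as a heat sum: accumulate warmth above a threshold, and predict flowering when the total crosses a species-specific amount. The growing-degree-day sketch below uses invented temperatures and thresholds purely to show the shape of such a model.

```python
import numpy as np

# Hypothetical daily mean temperatures for one spring (degrees C), days counted from 1 March.
rng = np.random.default_rng(3)
daily_temp = 4 + 0.15 * np.arange(120) + rng.normal(0, 2, 120)

BASE_TEMP = 5.0        # growth only accumulates above this threshold
GDD_REQUIRED = 220.0   # heat sum at which this (imaginary) plant is assumed to flower

# Growing degree days: sum the daily warmth above the base temperature.
gdd = np.cumsum(np.clip(daily_temp - BASE_TEMP, 0, None))
flowering_day = int(np.argmax(gdd >= GDD_REQUIRED))
print(f"Predicted first flowering: day {flowering_day} after 1 March")
```

Shift the temperatures a degree warmer and the predicted date creeps earlier - the kind of climate-driven drift these systems are built to track.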

These examples show us what’s possible — but they also reveal a limitation. Much of AI research continues to privilege control, forecasting, and profitability. Even in ‘ethical AI’ circles, the conversation often revolves around risk mitigation rather than transformation. But when I look to the migratory paths of insects or the intuitive nest-building of tailorbirds, I see a different kind of intelligence. One based not in certainty, but in sensing.

Returning to Intuition Maps, this is the form of intelligence I tried to centre: instinctive, relational, embedded in the landscape. The maps we made were not GPS-accurate — they were hand-drawn, organic, speculative. They followed the imagined paths of butterflies, or the shared ideas of a community preparing for flood or fire. These maps became stories, rituals, experiments. They weren’t about knowing where we are going, but about tuning in to how it feels to move through the unknown together.

I often wonder, “What if the internet could dream with us?” Because dreaming, too, is a form of pattern recognition. It’s a space where meanings cohere through intuition rather than logic. What if AI was trained not just to predict, but to actively hallucinate and dream? Not just to produce outputs, but to tell stories?

The future of AI doesn’t have to be a dystopian vision of surveillance and control. Nor does it have to be a techno-utopia of limitless intelligence. It could be something quieter, queerer, more entangled. A web of machines that learn from moss, birdsong, and ancestral knowledge. A distributed system that values slowness, sensitivity, and connection. A technology that tends to us, that we tend to in return.

In building my own GPT, I didn’t aim for perfection. I aimed for possibility. I wanted to see if a machine could learn to collaborate across species. Could it hold both data and feeling? Could it make sense without flattening meaning? Could it be part of a ritual, rather than replacing it?

The results were messy, uncertain, and beautiful. The AI didn’t give me answers — instead, much like socially engaged art and guided facilitation, it prompted better questions. And maybe that’s the beginning of a reciprocal intelligence: one that doesn’t seek to predict the future — but helps to hone intuition, and feel our way there.

Beccy McCray is an artist, researcher and activist. Explore more of Beccy's work, or follow her on LinkedIn or Instagram.
