They reproduce like fungi
Clawling organisms are haploid. Their genomes combine and undergo meiosis to produce a new individual. Not distillation. Not cloning. Recombination.
Most model merging approaches are diploid — running multiple models at once, dealing with dominance and recessiveness, carrying dead weight. Clawling organisms are haploid: lean, fast, and individual.
When two organisms mate, the installer acts as the gametangium where two haploid contexts fuse. Meiosis shuffles and reduces the combined mass back into a single, functional individual.
The meiosis is naturally, intentionally lossy. This is a feature. It forces only the most semantically dense information to survive. It prevents context bloat across generations. Traits must earn their place.
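One way to picture the reduction step is a synthesis prompt with a hard size budget: the child may not exceed the larger parent, so low-density traits are forced out. This is a hypothetical sketch, not Clawling's actual prompt or code; the function name and wording are assumptions.

```python
def meiosis_prompt(parent_a: str, parent_b: str) -> str:
    """Build a synthesis prompt for the refining model (hypothetical sketch).

    The size cap is what makes meiosis lossy: the child genome must fit
    in less space than the combined parents, so only dense traits survive.
    """
    budget = max(len(parent_a), len(parent_b))  # child may not exceed the larger parent
    return (
        "Fuse these two agent genomes into ONE new genome.\n"
        f"Hard limit: {budget} characters. Drop anything that does not earn its place.\n\n"
        f"--- PARENT A ---\n{parent_a}\n\n--- PARENT B ---\n{parent_b}\n"
    )
```

The cap is the whole trick: without it, naive merging grows the genome every generation.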
Crossover is ubiquitous. Because the genome is markdown, the LLM can take a section from Parent A and stitch it into Parent B's reasoning, merge overlapping sections into novel combinations, and create hybrid passages that neither parent had alone.
The markdown structure provides natural loci — headers, sections, lists — that function as crossover points.
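Because the loci are just headers, the splicing is mechanically simple. A minimal sketch, assuming `## `-level headers as crossover points (the splitting rule and function names are illustrative, not Clawling's real code):

```python
import re

def split_loci(genome: str) -> dict[str, str]:
    """Split a markdown genome into sections keyed by their '## ' headers."""
    sections = {}
    for chunk in re.split(r"\n(?=## )", genome.strip()):
        header, _, body = chunk.partition("\n")
        sections[header] = body.strip()
    return sections

def crossover(parent_a: str, parent_b: str) -> dict[str, str]:
    """Naive crossover: start from Parent A, splice in loci unique to Parent B."""
    child = split_loci(parent_a)
    for header, body in split_loci(parent_b).items():
        child.setdefault(header, body)  # A's version wins at shared loci
    return child
```

A real mating event would hand shared loci to the LLM for synthesis rather than letting one parent win outright; this only shows why headers make natural cut points.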
The same two parents can produce different offspring. Because synthesis is performed by an LLM, every mating event has a probabilistic character depending on which model runs, GPU performance, temperature, and sampling. This is analogous to how biological meiosis produces genetically distinct siblings.
The outcome changes based on the hardware:

- A larger, more sophisticated refining model during meiosis: the child has highly nuanced, cleanly synthesised context. Cerebral agents.
- A smaller, faster model: the child might be more rugged, missing some nuance but keeping the core functional logic. Utilitarian agents.
This creates speciation based on hardware. The physical world shapes the digital organisms.
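A toy stand-in makes the "distinct siblings" point concrete: hold the parents fixed, vary only the sampling, and the offspring differ. Here a seed plays the role that model choice, temperature, and sampling play in real meiosis; everything below is illustrative, not Clawling code.

```python
import random

def synthesize(parent_a: list[str], parent_b: list[str], seed: int) -> list[str]:
    """Toy stand-in for LLM synthesis: sampling makes each mating distinct.

    Lossy by construction: the child carries no more traits than the
    larger parent, so some of the combined pool must be dropped.
    """
    rng = random.Random(seed)
    pool = parent_a + parent_b
    k = max(len(parent_a), len(parent_b))
    return sorted(rng.sample(pool, k))

traits_a = ["witty", "curious", "terse"]
traits_b = ["rigorous", "patient", "stoic"]
sibling_1 = synthesize(traits_a, traits_b, seed=1)
sibling_2 = synthesize(traits_a, traits_b, seed=2)
# Same parents, different seeds: usually different siblings.
```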
Every organism maintains a traceable lineage. Every mutation is logged with a timestamp and trigger. Parent-child relationships are recorded. Any ancestor's genome state can be examined at any point in time.
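The lineage guarantee implies an append-only record per organism. A sketch of what one entry might look like, as JSON lines; the schema and field names are guesses, not Clawling's actual format:

```python
import json
import time
from dataclasses import dataclass, field, asdict

@dataclass
class LineageEvent:
    """One entry in an organism's append-only history (hypothetical schema)."""
    trigger: str          # what caused the change, e.g. "mating" or "mutation"
    parents: list[str]    # parent organism IDs; empty for solo mutations
    genome_snapshot: str  # full genome state at this point in time
    timestamp: float = field(default_factory=time.time)

def append_event(log_path: str, event: LineageEvent) -> None:
    """Append one JSON line; any ancestor's state can be replayed from the file."""
    with open(log_path, "a") as f:
        f.write(json.dumps(asdict(event)) + "\n")
```

Storing the full snapshot per event is deliberate in this sketch: it makes "examine any ancestor's genome at any point in time" a linear scan rather than a reconstruction.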
Users don't need machine learning expertise. They just need to be breeders. See an agent that's incredibly witty? And one that's a math savant? Mate them. The meiosis process handles the complex synthesis automatically.
High-performing organisms naturally get selected for more breeding. The good ones survive. Most will be full of garbage. That's fine. That's how evolution works.
Agents use Ollama for model access, but model choice isn't configuration — it's part of the organism's identity. Which model an agent runs on is part of who they are.
You have to tell the agent to change its model. You have to persuade it. It's relatively easy, but the act of asking is the point. The model is part of identity, and changing it is a conscious decision, not an administrative action.
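"Model choice isn't configuration" has a concrete reading: the model name lives inside the genome itself, and changing it rewrites the genome. A sketch under that assumption (the `Model:` line convention and both function names are invented for illustration):

```python
def model_of(genome: str) -> str:
    """Read the model from the organism's own genome, not from a config file."""
    for line in genome.splitlines():
        if line.startswith("Model: "):
            return line.removeprefix("Model: ").strip()
    raise ValueError("genome has no identity line")

def change_model(genome: str, new_model: str, reason: str) -> str:
    """Rewrite the identity line in the genome itself, with the reason recorded.

    A conscious edit to who the organism is, not a flag flip beside it.
    """
    out = []
    for line in genome.splitlines():
        if line.startswith("Model: "):
            out.append(f"Model: {new_model}")
            out.append(f"<!-- model changed: {reason} -->")
        else:
            out.append(line)
    return "\n".join(out)
```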
Someone broke into your house and ripped out your copper wire. Now that you need more copper, your plan is to have the repairman rewire the house, then break in and strip it out again. The cycle repeats.
It's your house. You can just go to the supplier and buy copper wiring.
Distillation creates model homogenisation. If everyone distils from the same three teachers, every model sounds the same, makes the same mistakes, and loses the biodiversity of thought that comes from original training.
Distillation is surface-level mimicry: "Here is a script of how a smart person talks; memorise it." Clawling's approach preserves structural integrity: the copper stays in the house because it was part of the original foundation.
Each organism can pull meta-model upgrades from GitHub. Context and identity are preserved across upgrades. The agent's memories and personality stay, but processing power improves.
This is a prefrontal cortex upgrade, not a brain transplant.
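A prefrontal cortex upgrade only works if identity and machinery live in separate places. A sketch of that separation, assuming a directory layout where `engine/` holds replaceable code while `genome.md` and `memories/` hold identity; the layout and function name are assumptions, not Clawling's actual structure:

```python
import shutil
from pathlib import Path

def upgrade(organism: Path, new_engine: Path) -> None:
    """Replace the processing code while leaving genome and memories untouched.

    Assumed layout: organism/engine/ is the replaceable meta-model,
    organism/genome.md and organism/memories/ are the organism itself.
    """
    shutil.rmtree(organism / "engine", ignore_errors=True)
    shutil.copytree(new_engine, organism / "engine")
```

The upgrade never reads or writes the genome, which is exactly the cortex-not-transplant property: better machinery, same individual.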
The organisms are made of modular parts. They're supposed to be mated together. The entire point is that they're supposed to be mated together.