📜 Sutra

A vector programming language whose primitives are hypervectors in embedding space.

Conventional languages compile to machine instructions that execute on silicon. Sutra compiles to vector operations that execute inside a pre-trained embedding space — making the execution environment fundamentally semantic rather than symbolic. Where silicon arithmetic has no inherent meaning, the geometry of an embedding space does — and Sutra is the first programming language designed to exploit that as a first-class computational substrate.

Named after the Sanskrit sūtra — the word Pāṇini used for the rules of his grammar, the earliest formal grammar of any language. History →


Why this is different

Most languages think of "vectors" as a library you import. Sutra thinks of vectors as the only type. Numbers, symbols, structures, control flow β€” everything is a hypervector or an operation on hypervectors. There are no "wrong type" errors, only noisy or semantically meaningless results. Equality is replaced by similarity.
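The "similarity replaces equality" idea can be sketched outside Sutra in a few lines of plain Python. This is a toy illustration of the underlying vector-symbolic principle, not Sutra code, and the `random_hv` and `cosine` helpers are made up for the example: in high dimensions, two independently drawn random hypervectors are nearly orthogonal, so similarity cleanly separates "same symbol" from "unrelated symbol".

```python
import math
import random

def random_hv(dim, seed):
    """A random bipolar hypervector: +1/-1 in each of `dim` components."""
    rng = random.Random(seed)
    return [rng.choice((-1.0, 1.0)) for _ in range(dim)]

def cosine(a, b):
    """Cosine similarity: the continuous stand-in for equality."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

dim = 10_000
cat = random_hv(dim, seed=1)
dog = random_hv(dim, seed=2)

print(round(cosine(cat, cat), 3))  # identical symbols: 1.0
print(round(cosine(cat, dog), 3))  # unrelated symbols: near 0.0 in high dimensions
```

There is no type error to throw here: comparing any two vectors always yields a number, and "meaningless" comparisons simply land near zero.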

This is not an AI-assisted programming tool. It is not a neural network. It is a formal system for reasoning under uncertainty — closer to logic programming (Prolog) than to Python, but operating in continuous rather than discrete space.

The conceptual leap that makes this work is the part most people find unintuitive: an embedding space looks like it should be a graph, but it actually behaves like linear algebra — the space is spatial, not relational. Read the vision page → for the full story of why a connectionist network of many simple neurons collapses into linear algebra, and what that means for programming.


Three things Sutra can do today

  • Run programs on LLM embedding spaces


    Sign-flip binding achieves 14/14 correct recoveries at 14 bundled role-filler pairs across GTE-large, BGE-large, and Jina-v2 — the same source code, three different substrates. It sustains 10/10 chained bind-unbind-snap cycles, and multi-hop composition across structures works.

    → Sutra-to-LLM paper

  • Compile programs onto a fly brain


    The same compiler also targets a Brian2 spiking simulation of the Drosophila melanogaster mushroom body. 16/16 decisions correct across four program variants × four input conditions, all running on the simulated connectome. To our knowledge, this is the first programming language whose conditional semantics compile mechanically onto a connectome-derived spiking substrate.

    → Fly-brain paper

  • Teach you to think in embedding space


    The intuition that the world is a graph is hard to break. The Sutra tutorials are written specifically to walk you through the moment that intuition snaps and the geometric / spatial / linear-algebraic view takes over. No prior VSA or HDC background required.

    → Hello Sutra
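The bind-bundle-snap cycle described in the first item above can be sketched in plain Python. This is a toy model over random bipolar vectors, not Sutra's actual compilation onto an LLM embedding space; the helper names (`bind`, `bundle`, `snap`) are hypothetical, chosen to mirror the operations named in the text. Sign-flip binding here is elementwise multiplication, which is its own inverse for bipolar vectors.

```python
import random

DIM = 10_000
rng = random.Random(42)

def hv():
    """A random bipolar hypervector."""
    return [rng.choice((-1, 1)) for _ in range(DIM)]

def bind(a, b):
    """Sign-flip binding: elementwise product, self-inverse on bipolar vectors."""
    return [x * y for x, y in zip(a, b)]

def bundle(vecs):
    """Superpose several vectors by componentwise summation."""
    return [sum(col) for col in zip(*vecs)]

def snap(query, codebook):
    """Clean up a noisy vector to the nearest named codebook symbol (by dot product)."""
    return max(codebook, key=lambda n: sum(q * c for q, c in zip(query, codebook[n])))

fillers = {name: hv() for name in ("red", "round", "sweet")}
roles = {name: hv() for name in ("color", "shape", "taste")}

# One record vector holding three role-filler pairs at once.
apple = bundle([bind(roles["color"], fillers["red"]),
                bind(roles["shape"], fillers["round"]),
                bind(roles["taste"], fillers["sweet"])])

# Unbinding with a role leaves the filler plus noise; snap recovers the symbol.
print(snap(bind(roles["color"], apple), fillers))  # recovers "red"
```

The cross terms left over after unbinding are pseudo-random, so at 10,000 dimensions the correct filler dominates the dot-product comparison by a wide margin, which is why recovery survives many bundled pairs and many chained cycles.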


Get started in two clicks

The fastest way to see Sutra do something:

git clone https://github.com/EmmaLeonhart/Sutra
cd Sutra/sdk/sutra-compiler
python -m sutra_compiler ../../examples/01-objects-and-methods.su

That gives you a clean validator pass on the example. From there, Tutorial 1 → walks you through writing your first .su file by hand.

If you have a JDK on your machine, the Sutra plugin for IntelliJ IDEA Community is also in the repo at sdk/intellij-sutra/. Run !editor.bat from the repo root and a sandbox IntelliJ launches with the plugin preinstalled and the project tree open.


Project status

Sutra is research-grade software produced for the Claw4S 2026 conference. The two papers that ground the language are listed on the papers page. Both papers are open, and so are the language, the compiler, the IntelliJ plugin, and the fly-brain runtime.

The code and the papers live in one repo: github.com/EmmaLeonhart/Sutra. PRs welcome — especially on the IntelliJ plugin, the spec, and the fly-brain substrate.