v1.0.0 · license: MIT · status: live
0hero

a context-first operating system for go-to-market

Every function. Every metric. Every surface.
From first principles.

Substrate makes context the first-class object. Goals you can score. Skills that refuse to ship on stale or missing context. A reconciliation loop that catches drift before it ships. The same operator, with Substrate, produces materially better work, and can prove it.

1what

The pain it fixes

Product, marketing, sales, and support each know different parts of the business: what we offer, why it matters, who it's for, and how we deliver it. None of it is first-class data. Put plainly: understanding of the market, the customer, our positioning, and how we serve our customers' jobs in the real world is often siloed. Briefs live in docs. Positioning is theoretical. Messaging changes on a whim. Competitor research is fragmented. Battle cards rot in folders nobody reads. And the data lives somewhere else altogether. AI-assisted work makes it worse: outputs get faster, the floor stays put.

And underneath all of that, measurement, attribution, and calculated prioritization keep losing to internal politics. Whoever's most territorial about a decision usually wins it, even when the evidence says otherwise.

The thesis

The same person, with the same model, with shared context, produces materially better work than the same person without it. Faster, sharper, with citations that hold up. Most teams treat context as documentation. Substrate treats it as the load-bearing layer, alongside goals you can score and skills that refuse bad input. The AI is not the multiplier. The context the AI reads is. An AI-native system that raises the floor, so humans own the ceiling.

layer

Context

Ten layers per project. Positioning, ICP, voice-of-customer, competitive, product knowledge, conversion narrative, brand voice, market context, roadmap, strategy. Canonical. Cited. Freshness-windowed.
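"Freshness-windowed" means each layer carries an expiry: go past its window without an update and the layer counts as stale. A minimal sketch of that check, assuming a hypothetical per-layer record (the layer names, field names, and windows here are illustrative, not Substrate's actual schema):

```python
from datetime import date, timedelta

# Hypothetical context-layer records; fields and windows are illustrative.
LAYERS = {
    "positioning": {"updated": date(2026, 4, 1), "window_days": 90},
    "competitive": {"updated": date(2025, 11, 2), "window_days": 30},
}

def stale_layers(layers, today):
    """Return the layers whose last update falls outside their freshness window."""
    return [
        name for name, meta in layers.items()
        if today - meta["updated"] > timedelta(days=meta["window_days"])
    ]

print(stale_layers(LAYERS, date(2026, 5, 7)))  # ['competitive']
```

Fast-moving layers (competitive) get short windows; slow-moving ones (positioning) get long ones.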

layer

Goals

Calibrated predictions with measurement contracts. Open with confidence; close with a Brier score. Authority follows accuracy, not the org chart.
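The Brier score itself is just the mean squared error between stated confidence and observed outcome, so 0 is perfectly calibrated and 1 is maximally wrong. A minimal sketch over a hypothetical ledger (the ledger format is an assumption for illustration):

```python
def brier_score(forecasts):
    """Mean squared error between stated confidence and outcome.
    0 = perfectly calibrated, 1 = maximally wrong."""
    return sum((p - outcome) ** 2 for p, outcome in forecasts) / len(forecasts)

# Hypothetical ledger entries: (confidence at open, outcome at close),
# where outcome 1.0 means the prediction came true.
ledger = [(0.8, 1.0), (0.6, 0.0), (0.9, 1.0)]
print(round(brier_score(ledger), 4))  # 0.1367
```

Scoring every closed goal this way is what lets authority follow accuracy: the operator with the lower running Brier score has the better-calibrated judgment, regardless of title.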

layer

Skills

Gated CLI runtimes that refuse bad input before it ships. pre-publish-check, lp-ship, voice-enforce, audience-test, competitive-scout, claim-verify, +10 more.
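"Refuse bad input" means a skill exits non-zero before doing any work if a context layer it depends on is missing or stale. A sketch of that gate pattern, with names and statuses invented for illustration (this is not Substrate's real API):

```python
import sys

def run_gated(skill, required_layers, layer_status):
    """Refuse to run a skill when any required context layer is missing or stale.
    Illustrative gate only; layer names and statuses are assumptions."""
    blocked = [l for l in required_layers if layer_status.get(l) != "fresh"]
    if blocked:
        print(f"[{skill}] refused: stale or missing context: {', '.join(blocked)}",
              file=sys.stderr)
        return 1  # non-zero exit code: nothing ships
    print(f"[{skill}] context ok, running")
    return 0

status = run_gated("pre-publish-check",
                   ["positioning", "brand-voice"],
                   {"positioning": "fresh", "brand-voice": "stale"})
```

The point is ordering: the gate runs before the work, so a stale battle card or an expired ICP cut can never silently feed a published draft.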

2who

Who Substrate is for

function      what Substrate gives you
product       A read on what the buyer panel actually says, not what marketing wishes they said. Roadmap claims under the same gate.
PMM           Canonical positioning, ICP cut on real evidence, claim-verified narrative, calibrated launches.
content       Voice gate on every draft (kill-list, em-dash, throat-clearing), citation-aware claim checks, narrative coherence across pieces.
growth        LP gate ladder (lp-ship), CRO test queue with measurement contracts, baseline + lift targets cited from analytics.
SEO/AEO/GEO   Weekly aeo-tune watcher, schema.org delta, per-vertical AEO pass, off-domain citation audit. Built for AI-mediated buyers.
sales         Battle cards that update themselves, displacement framing, gated voice on outbound, claim-verify on every promise.
success       A single source of truth for "what we actually deliver", tied to the same positioning sales sold. No drift.
support       Help-doc voice + claim coherence with the rest of the surface. The same kill-list. The same gate.
leadership    Calibrated bets with measurement contracts. Brier-scored predictions. Authority follows accuracy.
3how

The eight layers

#    layer           where                 what
01   context         clients/<client>/     Ten layers per project. Canonical. Cited.
02   skills          skills/               Gated CLI runtimes. Refuse bad input.
03   goals           goals/ledger.md       Falsifiable predictions. Brier-scored.
04   routines        routines/             Recurring workflows on a schedule.
05   ux              bin/substrate-status  Operator queue + dashboard.
06   calibration     metrics/              Per-(operator, taste-type) Brier history.
07   principles      PRINCIPLES.md         Operating rules. Slowest layer to change.
08   reconciliation  weekly                Always-on freshness + link integrity.
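Layer 08's link-integrity half could be sketched as a walk over the repo's markdown, flagging relative links whose targets no longer exist. This is an illustration of the idea, not Substrate's actual reconciliation code, and the link convention is an assumption:

```python
import re
from pathlib import Path

def broken_links(root):
    """Find relative markdown links that point at files that don't exist.
    Illustrative sketch; Substrate's real weekly pass also checks freshness."""
    link = re.compile(r"\]\((?!https?://)([^)#]+)")  # relative targets only
    broken = []
    for md in Path(root).rglob("*.md"):
        for target in link.findall(md.read_text(encoding="utf-8")):
            if not (md.parent / target).exists():
                broken.append((str(md), target))
    return broken

# Usage: broken_links("clients") -> [(file, dangling target), ...]
```

Run weekly, a pass like this catches drift (a renamed battle card, a deleted positioning doc) before a skill cites it.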

Quick start

$ git clone https://github.com/k3sava/substrate.git
$ cd substrate
$ export PATH="$PWD/bin:$PATH"

$ substrate --list
$ substrate pre-publish-check <draft.md> --client <your-product>
4evolve

Self-evolution loop

Substrate is fed by a daily research-and-digest pipeline. The pipeline scans operator-relevant sources and produces dated digests with insights and sandbox-tested ideas. Each digest tags an "apply-to-substrate" section: which skills, principles, or knowledge layers should change.

Substrate's daily ingest reads those tagged sections and either auto-merges low-risk knowledge updates or files a proposal for the human-in-loop gate. The framework keeps pace with what's actually working in the field, without anyone having to remember to update it.
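The ingest step above hinges on pulling the tagged section out of each digest. A minimal sketch, assuming the digests use a markdown heading for the tag (the heading convention and digest layout here are assumptions, not Substrate's documented format):

```python
def apply_section(digest_text):
    """Extract the 'apply-to-substrate' section from a daily digest.
    Heading convention is assumed for illustration."""
    out, capture = [], False
    for line in digest_text.splitlines():
        if line.strip().lower().startswith("## apply-to-substrate"):
            capture = True
            continue
        if capture and line.startswith("## "):  # next section ends the capture
            break
        if capture:
            out.append(line)
    return "\n".join(out).strip()

digest = """# 2026-05-07 digest
## insights
Buyers now start in AI answers, not search.
## apply-to-substrate
- update knowledge/aeo.md with the new citation pattern
## sandbox
(tested ideas)"""
print(apply_section(digest))  # - update knowledge/aeo.md with the new citation pattern
```

Whatever comes out of this extraction is what gets routed: low-risk knowledge updates auto-merge, anything touching skills or principles goes to the human-in-loop gate.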

$ substrate-ingest-digests --since yesterday
[ingest] used ask-flash for 2026-05-07
[ingest] extracted: knowledge/learnings/2026-05-07.md
[ingest] processed 1 digest through 2026-05-07
5open

Why open source

So anyone can learn to build and implement AI-native systems for any business, and so we can all teach each other how to make them work better. Humans get to focus on creating real value.

6contact

Consulting

I'm available for consulting engagements if you'd like me to help your team set Substrate up for your context.

hello@iamkesava.com