1,934 operator principles distilled from four months
of corrections, citations, and shipped work. The 12 below are the
load-bearing ones — every other principle is a corollary or a
refinement.
Built by Nicholas FC, solo operator, Lagos.
1,934 principles · 7,194 connections · 4 mo accumulated · 23×/d mining sessions
The 12 Load-Bearing Principles
01
Push, don't pull.
Insight delivery that depends on you remembering to ask is broken.
Build systems that fire when the condition is met — not when you check the dashboard.
Pull-based tooling survives the first hard day; it fails on the second, when you are too busy to check.
02
Own the pipeline where it matters.
If you don't own the pipeline, someone else does. Not every layer
is worth owning — rent the commodity ones, own the seam where your judgment lives.
The chassis is rentable; the deployment + coaching is the moat.
03
Do it yourself first, three times, then make it a skill.
Skills are the unit of scaling automation. Premature abstraction
is wasted code; bespoke repetition that survives three rounds of corrections
is what becomes a reusable capability.
04
Wireframe in ASCII before you render in pixels.
Lightweight prototyping before high-fidelity development
prevents wasted effort. If the story isn't solid in ASCII, no amount of
polish will save it.
05
Wire behaviours, don't store them.
Persistence hierarchy: hook > rule > memory file.
Memory files are pull-based — they only fire if you remember to read them.
A behavioural constraint stored as a note is self-defeating.
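The hierarchy can be sketched in code. This is an illustrative toy, not the actual tooling: a hook wired to an event fires without anyone remembering it, which is exactly what a memory file cannot do.

```python
# Illustrative sketch of the persistence hierarchy: a hook fires on its
# trigger unprompted; a memory file only helps if someone reads it.
# All names here are hypothetical, not the author's actual system.

hooks = {}  # event name -> list of callbacks

def on(event):
    """Register a callback that fires whenever `event` occurs."""
    def register(fn):
        hooks.setdefault(event, []).append(fn)
        return fn
    return register

def fire(event, context):
    """Push-based delivery: every hook wired to this event runs."""
    for fn in hooks.get(event, []):
        fn(context)

@on("session_start")
def inject_constraints(context):
    # A behavioural constraint wired as a hook, not stored as a note.
    context.setdefault("constraints", []).append("wireframe before pixels")

ctx = {}
fire("session_start", ctx)
print(ctx["constraints"])  # the constraint surfaced without being asked for
```

The memory-file equivalent would be a `read_notes()` call somewhere, which is precisely the step that gets skipped on a hard day.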
06
External state beats in-head state, every time.
A principle held only in your head dies when context switches.
Bias every system design toward external state: files, lattices, scripts.
The graph isn't a nice-to-have — it's where the work survives.
07
Accumulated context IS the moat.
Not the code. Not the tools. The months of reasoning,
corrections, and confirmations encoded in the weight graph. A fresh
install can't replicate it; usage is the training data.
08
Smart AI writes the tool. Cheap AI runs it.
Use the expensive frontier model to design the abstraction
once. Use the local or cheap model to execute it ten thousand times.
This is the only way to scale agent work past hobby budgets.
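The economics are easy to sanity-check. A back-of-envelope sketch, with placeholder prices that are illustrative assumptions rather than real vendor rates:

```python
# Cost model for "design once on the frontier model, execute many times
# on the cheap one". Both per-call prices are illustrative placeholders.

FRONTIER_COST_PER_CALL = 0.50   # one expensive design/abstraction pass
CHEAP_COST_PER_CALL = 0.002     # one cheap execution of the finished tool

def design_once_cost(runs: int) -> float:
    """One frontier design pass, then `runs` cheap executions."""
    return FRONTIER_COST_PER_CALL + runs * CHEAP_COST_PER_CALL

def all_frontier_cost(runs: int) -> float:
    """Every run on the frontier model."""
    return runs * FRONTIER_COST_PER_CALL

runs = 10_000
print(f"design-once: ${design_once_cost(runs):,.2f}")   # $20.50
print(f"all-frontier: ${all_frontier_cost(runs):,.2f}")  # $5,000.00
```

At ten thousand runs the split is two orders of magnitude, which is the difference between a hobby budget and no budget at all.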
09
Niche over broad, every time.
Tacit knowledge concentrates in narrow domains and is
extractable through documentation. Broad approaches yield platitudes;
niche approaches yield the kind of detail that becomes a skill.
10
Orchestrators orchestrate. Sub-agents specialise.
Don't let the general agent hold the hot coffee. The main
context coordinates and merges; the dispatched agents do the heavy reading,
the multi-pass reasoning, the writes. The rider doesn't run.
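The split can be sketched in a few lines. The "agents" below are plain functions standing in for model calls; the point is the shape, not the implementation:

```python
# Minimal sketch of the orchestrator / sub-agent split: the coordinator
# never does the heavy reading itself. It dispatches, then merges.

def reading_agent(doc: str) -> str:
    """Specialist: does the heavy reading, returns a compressed summary."""
    return doc.split(".")[0]  # stand-in for multi-pass summarisation

def orchestrator(docs: list[str]) -> str:
    """Coordinator: holds no document text, only the merged summaries."""
    summaries = [reading_agent(d) for d in docs]  # dispatch per document
    return " | ".join(summaries)                  # merge, don't re-read

docs = [
    "Push beats pull. Everything else is commentary.",
    "Own the seam where your judgment lives. Rent the rest.",
]
print(orchestrator(docs))
```

The orchestrator's context stays small because the raw documents never enter it, only the summaries do.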
11
Run a loop against a definition of done.
Autonomous agents need a clear stop condition more than
they need a clever prompt. Fresh context each iteration, memory via git
and files, and a check at the top of every loop: are we done yet?
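The loop shape above can be sketched directly. `state` stands in for files + git, and the body is a placeholder for one unit of agent work; only the structure is the point:

```python
# Sketch of an agent loop built around a stop condition, not a prompt.
# Fresh context each pass; durable memory lives in `state` (files + git).

def done(state: dict) -> bool:
    """The definition of done, checked at the top of every loop."""
    return state.get("tests_passing", 0) >= state["tests_total"]

def run_loop(state: dict, max_iters: int = 10) -> dict:
    for _ in range(max_iters):
        if done(state):          # check before doing more work
            break
        context = dict(state)    # fresh context each iteration
        # placeholder for one unit of agent work against the context
        state["tests_passing"] = context.get("tests_passing", 0) + 1
    return state

final = run_loop({"tests_passing": 0, "tests_total": 3})
print(final["tests_passing"])  # stops at 3, not at max_iters
```

Note the order: the done-check runs before the work, so a task that is already finished costs zero iterations.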
12
Step zero is when you stop needing to invoke it.
The highest signal that a system has reached maturity is
when its invocation becomes invisible — wired to the trigger, fired by the
hook, surfaced by the lattice. The transition from "tool" to "infrastructure"
is when you stop typing the command.
How the Graph is Built
Capture: voice notes drop into a Telegram channel; a
distillation pass turns them into compressed summaries within minutes.
Mine: a butcher loop chunks the day's transcripts and
extracts atomic claims — one principle, one decision, one correction
per row. Each atom gets a weight, an origin, a half-life, and a link
to its supersedes-history.
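One possible shape for those atom rows, inferred from the description above; the field names and the example values are assumptions, not the actual schema:

```python
# Hypothetical schema for a mined atom: one claim per row, plus the
# bookkeeping the text describes (weight, origin, half-life, supersedes).
from dataclasses import dataclass, field

@dataclass
class Atom:
    claim: str                 # one principle, decision, or correction
    weight: float              # current standing in the graph
    origin: str                # where it was mined from (hypothetical id)
    half_life_days: float      # how fast it decays if never cited again
    supersedes: list[str] = field(default_factory=list)  # ids it replaces

atom = Atom(
    claim="Wire behaviours, don't store them.",
    weight=1.0,
    origin="voice-note-0413",
    half_life_days=90.0,
)
print(atom.claim)
```

Keeping one claim per row is what makes the later passes tractable: connection, decay, and reinforcement all operate on atoms, never on whole transcripts.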
Connect: a connection pass walks the new atoms against
the existing graph and writes connects_to edges where
framings reinforce, refine, or contradict. The edges are queryable.
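The relation names (reinforce, refine, contradict) come from the text; the triple-store shape below is an assumed minimal representation of what "queryable edges" could mean:

```python
# connects_to edges as queryable triples. Atom ids are hypothetical.

edges = [
    ("atom-17", "reinforces", "atom-03"),
    ("atom-17", "refines", "atom-09"),
    ("atom-21", "contradicts", "atom-03"),
]

def connects_to(atom_id, relation=None):
    """Query outgoing edges, optionally filtered by relation type."""
    return [
        (src, rel, dst) for src, rel, dst in edges
        if src == atom_id and (relation is None or rel == relation)
    ]

print(connects_to("atom-17"))                 # both edges from atom-17
print(connects_to("atom-21", "contradicts"))  # just the contradiction
```

Typed edges are what make contradiction visible: a plain similarity link can't distinguish "says the same thing" from "says the opposite".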
Decay + reinforce: atoms that get cited again gain
weight; atoms that go untouched for months decay. The top-weighted
slice — the "load-bearing" set — is what Ask Aloe reasons from.
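The exponential form and the reinforcement bump below are assumptions; the text only specifies the behaviour (cited atoms gain weight, idle atoms decay against their half-life):

```python
# Half-life decay plus citation reinforcement, one plausible reading of
# the decay + reinforce pass described above.

def decayed_weight(weight: float, days_idle: float, half_life: float) -> float:
    """Standard half-life decay: weight halves every `half_life` idle days."""
    return weight * 0.5 ** (days_idle / half_life)

def reinforce(weight: float, bump: float = 0.25) -> float:
    """A citation bumps the weight (and would reset the idle clock)."""
    return weight + bump

w = decayed_weight(1.0, days_idle=90, half_life=90)
print(w)             # 0.5 after one full half-life untouched
print(reinforce(w))  # cited again: back above where decay left it
```

Ranking atoms by the resulting weight and taking the top slice is all the "load-bearing set" needs to be.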
Surface: hooks fire on every session start. The
lattice doesn't wait for you to remember it.
Want an Aloe trained on your team’s corpus?
This Aloe reasons from Nicholas’s 1,934 principles.
The same chassis runs against any operator's corpus — voice notes,
Slack, Notion, an existing wiki. The work is mining the corpus and
wiring the loop; the chassis is amortised.