@libar-dev/delivery-process   v1.0.0-pre

Stop Prompting.
Start Context Engineering.

The architectural memory engine for AI-assisted codebases.

AI agents scrape stale Markdown, hallucinate APIs, and exhaust context windows before reaching the right files. When context compresses mid-session, prose breaks — agents lose continuity and repeat mistakes. delivery-process makes your codebase machine-queryable: annotate TypeScript and Gherkin with structured tags, query exact context with a single CLI call, and get specs that survive compaction where prose doesn’t.

$ npm install @libar-dev/delivery-process@pre
Before
# Hallucination risk: High
$ cat ROADMAP.md src/core/*.ts | head -500
$ grep -r "EventStore" --include="*.ts"
# 40+ files, ~8,000 tokens, still incomplete

After
# Hallucination risk: Zero
$ pnpm process:query -- \
    context TransformDataset \
    --session implement
# 1 call, ~200 tokens, complete & accurate

Proven at Monorepo Scale

Across an 8.8M-line monorepo — 43,949 files, 422 executable specs.

5x
Session Throughput

3 production sessions, 5x output. Context stayed at 65% during heavy editing — and decreased as work progressed.

50-65%
Context Efficiency

Structured specs use 50–65% of the context that prose prompts consume. After context compaction: zero impact — structured specs maintain full continuity where prose fails.

0
ESLint Rules

106 custom rules existed to catch AI architecture violations. Structured specs eliminate the root cause — agents generate correct patterns from the start.

Config

Preset selection, tag registry init, path resolution. Supports generic, libar-generic, and DDD-ES-CQRS presets.

src/config/
Scanner

File discovery, AST parsing, opt-in detection. Processes TypeScript JSDoc and Gherkin feature files independently.

src/scanner/
Extractor

Pattern extraction, shape resolution, and cross-source merging with conflict detection, producing ExtractedPattern[].

src/extractor/
Transformer

Builds MasterDataset in a single O(n) pass — pre-computed views, relationship index, and architecture data.

src/generators/pipeline/
Codec

20 Zod codecs transform MasterDataset into RenderableDocument, then to Markdown with progressive disclosure.

src/renderable/
Dual-Source Input → Single Read Model → Living Docs

TypeScript owns runtime structure. Gherkin owns planning and behavior. Each source flows through the four-stage pipeline — Scanner → Extractor → Transformer → Codec — into a single queryable MasterDataset. Neither source duplicates the other. The agent queries truth directly instead of reading both.
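
The hand-off between stages can be pictured with a few TypeScript shapes. This is a minimal sketch under simplifying assumptions — ExtractedPattern and MasterDataset appear in the pipeline above, but the fields shown here are illustrative, not the package's actual interfaces.

// Illustrative only — simplified shapes, not the real types.

// Scanner: discovers opted-in sources and their raw @libar-docs-* tags.
interface ScannedSource {
  path: string;
  kind: 'typescript' | 'gherkin';
  tags: Record<string, string>;
}

// Extractor: one record per pattern, merged from both sources.
interface ExtractedPattern {
  name: string;
  status: 'roadmap' | 'active' | 'completed';
  uses: string[];
  usedBy: string[];
  dependsOn: string[];
}

// Transformer: the single queryable read model, built in one O(n) pass.
interface MasterDataset {
  patterns: Map<string, ExtractedPattern>;
  relationships: Map<string, string[]>; // pre-computed dependency index
}

// Codec: MasterDataset -> RenderableDocument -> Markdown (sketched as a signature).
type RenderableDocument = { title: string; sections: string[] };
declare function render(dataset: MasterDataset): RenderableDocument;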

1. TypeScript Implementation Stub
/**
 * @libar-docs
 * @libar-docs-status roadmap
 * @libar-docs-uses EventStoreFoundation, Workpool
 * @libar-docs-used-by SagaEngine, CommandOrchestrator
 */
export function durableAppend(
  event: DomainEvent
): Promise<Result> {
  throw new Error('Not implemented');
}
2. Gherkin Specification
@libar-docs
@libar-docs-pattern:EventStoreDurability
@libar-docs-status:roadmap
@libar-docs-phase:18
@libar-docs-depends-on:EventStoreFoundation
Feature: Event Store Durability

  Rule: Appends must survive process crashes

    Example: Crash mid-write
      Given a DomainEvent is ready to commit
      When the process crashes mid-write
      Then the append retries to completion
3. Agent Queries the Truth
# Validate scope before starting a session
$ pnpm process:query -- scope-validate EventStoreDurability implement

# Pull exact context — zero hallucination, minimal tokens
$ pnpm process:query -- context EventStoreDurability --session implement

Your delivery process, queryable.

AI agents parse stale generated docs and hallucinate context. The Data API queries annotated source directly — 26 typed methods, real-time, zero extra tokens.

$ pnpm process:query -- overview
=== PROGRESS ===
318 patterns  224 completed · 47 active · 47 planned  70%

=== ACTIVE PHASES ===
Phase 24: ProcessStateAPIRelationshipQueries
Phase 25: DataAPIStubIntegration

=== BLOCKING ===
StepLintExtendedRules    StepLintVitestCucumber

↑ actual output · delivery-process querying its own 318-pattern source

Three Sessions. Every Pattern.

01

Planning Session

Spec before code

Turn a pattern brief into a structured Gherkin spec. Define deliverables, set the phase number, declare dependencies. No implementation — just a locked roadmap spec ready to drive the next session.

roadmap
02

Design Session

Decisions before implementation

Make architectural choices with options documented as Decision Specs. Create typed TypeScript stubs — interfaces and function signatures that compile but throw. The contract that implementation must fulfill.

roadmap
03

Implementation Session

Code from spec

The spec is the contract. Transition to active, implement each deliverable, run Process Guard at every commit. When all deliverables are complete, transition to completed — hard-locked forever.

roadmap → active → completed

Four Pillars

01

Stop Scraping. Start Querying.

AI-Native Data API

Traditional AI tools ingest entire repos. process-api feeds curated JSON or text directly to the agent. One call for scope validation, one for a curated context bundle — dependency trees, architectural impact, deliverables. Zero hallucination. 5x fewer tokens.

02

Immutable Delivery Rules

FSM-Enforced Process Guard

Documentation guidelines are ignored; state machines are enforced. A rigid FSM (roadmap → active → completed) lives in your codebase, validated by a Decider-pattern pre-commit hook. Active specs are scope-locked; completed features cannot be modified without an auditable unlock reason.
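
As a rough sketch of what a Decider-style check can look like — the state names come from the FSM above, while the function and field names are hypothetical, not the package's exports:

type SpecStatus = 'roadmap' | 'active' | 'completed';

interface ProposedChange {
  pattern: string;
  from: SpecStatus;
  to: SpecStatus;
  unlockReason?: string; // required to touch anything already completed
}

// Decider pattern: a pure function that either accepts the change or
// explains why the commit must be rejected. (Illustrative, not the real hook.)
function decide(change: ProposedChange): { ok: boolean; reason?: string } {
  const legal: Record<SpecStatus, SpecStatus[]> = {
    roadmap: ['roadmap', 'active'],
    active: ['active', 'completed'],
    completed: ['completed'],
  };
  if (!legal[change.from].includes(change.to)) {
    return { ok: false, reason: `illegal transition: ${change.from} -> ${change.to}` };
  }
  if (change.from === 'completed' && !change.unlockReason) {
    return { ok: false, reason: 'completed specs are hard-locked without an unlock reason' };
  }
  return { ok: true };
}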

03

Dual-Source Truth

Four-Stage Pipeline

TypeScript owns runtime relationships — uses, used-by, category. Gherkin owns planning constraints — status, phase, depends-on. Neither duplicates the other. The four-stage pipeline merges both into a unified MasterDataset, and anti-pattern validators ensure the boundary is never crossed.
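
A toy version of that merge, assuming the ownership split just described — the shapes and function are illustrative only, with values borrowed from the examples above:

// TypeScript contributes runtime relationships; Gherkin contributes planning facts.
interface TsFacts { uses: string[]; usedBy: string[]; category?: string }
interface GherkinFacts { status: string; phase?: number; dependsOn: string[] }

// Neither side repeats the other's fields, so merging is a plain union.
function mergePattern(name: string, ts: TsFacts, plan: GherkinFacts) {
  return { name, ...ts, ...plan };
}

const durability = mergePattern(
  'EventStoreDurability',
  { uses: ['EventStoreFoundation', 'Workpool'], usedBy: ['SagaEngine'] },
  { status: 'roadmap', phase: 18, dependsOn: ['EventStoreFoundation'] },
);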

04

Codec-Driven Living Documents

Progressive Disclosure & Rich Relationships

20 Zod-validated codecs transform one annotation set into rich Markdown for humans and token-efficient modules for AI. Relationship tags — uses, used-by, implements, depends-on — materialise into live Mermaid architecture diagrams scoped by arch-context or arch-layer. Six diagram types, auto-extracted TypeScript shapes, Definition of Done traceability matrices. One source. Two audiences. Zero duplication.
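
For a feel of the codec stage, here is a minimal Zod sketch. The schema fields and render functions are assumptions for illustration — none of the package's 20 codecs is reproduced here.

import { z } from 'zod';

// Validate the slice of MasterDataset this codec consumes…
const PatternView = z.object({
  name: z.string(),
  status: z.enum(['roadmap', 'active', 'completed']),
  uses: z.array(z.string()).default([]),
});

// …then render the same data twice: rich Markdown for humans,
// a terse one-liner for an agent's context window.
function toMarkdown(view: z.infer<typeof PatternView>): string {
  return `## ${view.name}\n\nStatus: ${view.status}\nUses: ${view.uses.join(', ') || 'none'}\n`;
}

function toAgentLine(view: z.infer<typeof PatternView>): string {
  return `${view.name} [${view.status}] uses=${view.uses.join(',')}`;
}

const view = PatternView.parse({
  name: 'EventStoreDurability',
  status: 'roadmap',
  uses: ['EventStoreFoundation'],
});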

Model Context Protocol Coming Soon

Expose your entire delivery process state — dependency graphs, roadmap, pattern registry — natively to Cursor and Claude via Anthropic’s MCP standard. No CLI install. No context setup. Just perfect architectural memory.