Practical Applications: Proof of Network
(Part 1 of 3 — same chapter in the PDF; split for the web site.)
Chapter 10

Strategic Takeaway

Infrastructure claims are hypotheses until a production application breaks them. Delula and Sidelines are first-party applications whose operational demands forced the Scrypted Network's abstractions to become production-grade before anyone else built on them. Infrastructure projects that ship only an API and a dashboard face a credibility gap: the abstractions look clean in documentation but have never absorbed the chaos of a real product. This chapter documents three applications that close that gap: one exercising the orchestration and fulfillment stack, one exercising the verification and training stack, and one exercising the world-model thesis that games are the training ground for agentic intelligence. Each application is to Scrypted what Gmail was to Google's infrastructure: a first-party product whose operational demands force the platform to mature.

10.1 Delula: consumer AI content creation

Delula is a unified AI content creation platform. Users describe an idea in natural language; the system parses intent, generates images and videos through composable AI workflows, and delivers results in seconds (from a pre-generated backlog) or minutes (from a custom generation queue). The platform supports guest and registered users, a credit-based economy, recipe-based workflows, real-time progress updates via WebSocket, and an agentic chat editor with MCP tool invocation.
Every Delula feature exercises a Scrypted Network primitive:

| Delula feature | Scrypted primitive exercised |
| --- | --- |
| Recipe-based generation | Recipe DAGs, ingredient composition (Ch 5) |
| Custom generation queue | Job lifecycle, shadow graph (Ch 6) |
| Pre-generated backlog pool | Materialized-route thinking; pre-computation as fulfillment |
| Multi-provider routing | Provider abstraction, adapter pattern (Ch 8) |
| Real-time progress updates | SSE / pub-sub job stream (§6.9) |
| Credit-based billing | SCRYPTOSHI, cost models, ledger (Ch 7) |
| Content moderation pre-flight | Trust & safety, two-stage assessment (Ch 9) |
| LLM-driven intent parsing | Open Intents resolution (§11.3) |
| Chat agent with MCP tools | Multi-protocol ingress (§11.5) |
| Downstream outage detection | Self-healing networks, Sora case study (§2.6) |
The remainder of this section organizes Delula's contributions into two arcs: scaling and fulfillment (production failures that shaped the network's reliability architecture) and dynamic workflows (agentic patterns that proved the network's composition model).

10.1.1 Scaling arc: failures that shaped the network

The Scrypted Network's four-layer fault tolerance model (Chapter 6, §6.7; Chapter 15, §15.1) did not emerge from theoretical design. It was extracted from Delula's production incident history.

From in-memory queue to persistent shadow graph

Delula's first generation queue was in-memory. When the process restarted, a routine event during deployments, every queued and in-progress job vanished. Users saw generations disappear mid-processing with no error, no refund, and no recovery path. The fix: a PostgreSQL-backed queue using FOR UPDATE SKIP LOCKED for atomic dequeue, bounded-parallelism job processing, and a processingJobs guard against duplicate dispatch. This failure is the origin story of the Scrypted Network's shadow graph (§6.2). The principle that job state must be durable and crash-recoverable was not a design axiom; it was a production lesson.

The database-backed queue also introduced tiered priority (subscribers → credit users → registered free → guests) with aging-based fairness: a job's effective priority improves the longer it waits, preventing starvation. This maps directly to the network's attention-auction ranking (§3.5), where willingness-to-pay must be balanced against fairness.

Multi-hop provider outage: the Grok 500 cascade

Delula's generation chain runs through multiple hops. When an upstream provider returned HTTP 500 errors, Delula had no detection mechanism and no fallback. Users experienced silent failures: generations that entered "processing" and never completed.
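The atomic-dequeue and aging pattern can be sketched as follows. The table and column names (jobs, tier, created_at) and the five-minute aging step are illustrative assumptions, not Delula's actual schema or configuration:

```python
# Sketch of a crash-safe dequeue with aging-based fairness.
# Schema and aging constants are illustrative assumptions.

DEQUEUE_SQL = """
UPDATE jobs SET status = 'processing', claimed_at = now()
WHERE id = (
    SELECT id FROM jobs
    WHERE status = 'queued'
    ORDER BY
        -- base tier (0 = subscriber ... 3 = guest) minus an aging bonus,
        -- so long-waiting low-tier jobs eventually outrank fresh high-tier ones
        tier - floor(extract(epoch FROM now() - created_at) / 300) ASC,
        created_at ASC
    LIMIT 1
    FOR UPDATE SKIP LOCKED   -- concurrent workers skip rows already claimed
)
RETURNING id;
"""

def effective_priority(tier: int, wait_minutes: float, aging_step: float = 5.0) -> float:
    """Lower value = dequeued sooner. Every `aging_step` minutes of waiting
    improves a job's rank by one tier, preventing starvation."""
    return tier - wait_minutes // aging_step
```

Because the row claim happens inside a single UPDATE with SKIP LOCKED, multiple workers can poll the same table without double-dispatching a job, which is the same guarantee the backlog pool relies on.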
The fix shipped five layers:

1. An endpoint-availability registry tracking each gateway/provider pair.
2. Automatic detection: N failures from M distinct users within T minutes marks an endpoint unavailable.
3. A model fallback map for automatic rerouting, itself checked against the registry.
4. Hourly health canaries that restore endpoints on success.
5. User-facing banners with provider-status context.

This is the operational proof of the self-healing pattern described in §2.6. When OpenAI cancelled Sora in March 2026, the whitepaper described how recipes should re-resolve to the next available video agent. Delula had already built this mechanism, not because of Sora's cancellation, but because Grok's intermittent 500s forced the same design. The detection-threshold approach also informed the network's abuse-resistance thinking (§9.6): statistical aggregation that detects provider outages can also detect coordinated abuse patterns.

Type-aware recovery and webhook idempotency

A video generation stuck in "processing" for over a day exposed two flaws: recovery windows were one-size-fits-all (too aggressive for video, too lenient for text), and parallel webhook/polling paths could race to corrupt state. Type-aware recovery replaced the uniform timeout with per-type windows (60 seconds for text, 30 minutes for video, 24-hour hard cap). Webhook/polling convergence shipped through a single processJobCompletion entry point that checks terminal state before any mutation, enforcing one action, one commit. HMAC-SHA256 signatures, status normalization, and payload-nesting resolution completed the hardening.

The one-action-one-commit discipline became the governing principle of the Scrypted orchestration engine (Chapter 6). Type-aware recovery became per-step timeout configuration in recipe
definitions. Treating gateway errors (502, 503, 504) as retriable events informed the fault taxonomy that classifies upstream responses into retriable and terminal categories (§15.1).

Pre-computation as fulfillment strategy

AI content generation takes seconds to minutes. For a consumer product, any wait destroys engagement. Delula introduced a two-tier system: (1) a pre-generated content pool (the "backlog") maintained by a background service, delivering near-instant results via atomic FOR UPDATE SKIP LOCKED claims; (2) a custom queue with priority tiers and real-time WebSocket progress updates for parameterized requests.

Pre-computation as fulfillment connects to the network's materialized-route concept (§11.3). A materialized route caches the compiled execution plan; the backlog extends this further by caching the result. For the network, this suggests a tier model where high-frequency intents are fulfilled from pre-computed inventories rather than by executing fresh recipes every time: an economic optimization where the network invests compute speculatively to reduce latency for common queries.

10.1.2 Dynamic workflows: from static pipelines to agentic composition

Workflow components as proto-ingredients

Delula's pipeline executes a sequence of workflow components (intent parsing, image generation, video generation, media processing), where each component specifies a service, endpoint, parameters, and output mappings. A workflow registry maps types to ordered component arrays; the processor executes them sequentially, passing outputs from earlier steps as inputs to later ones. This is the direct precursor to the network's recipe structure (§5.2). Delula's workflow component became the recipe step; the workflow registry became the ingredient/recipe registry; and sequential execution became the RecipeExecutorV2's DAG resolution, with parallel dispatch where dependency edges permit (§6.2).
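The registry-plus-sequential-processor shape can be sketched in a few lines. The field names (run, output_key) and the stub lambdas standing in for service calls are illustrative assumptions, not Delula's actual component schema:

```python
# Minimal sketch of registry-driven workflow components.
# Field names and the stub service calls are illustrative assumptions.
from dataclasses import dataclass
from typing import Callable

@dataclass
class WorkflowComponent:
    name: str
    run: Callable[[dict], dict]   # stand-in for a service/endpoint invocation
    output_key: str               # where this step's output lands in the context

# Registry mapping workflow types to ordered component arrays.
REGISTRY: dict[str, list[WorkflowComponent]] = {
    "text_to_video": [
        WorkflowComponent("parse_intent",
                          lambda ctx: {"prompt": ctx["idea"].strip()}, "intent"),
        WorkflowComponent("generate_image",
                          lambda ctx: {"image": f"img({ctx['intent']['prompt']})"}, "image"),
        WorkflowComponent("generate_video",
                          lambda ctx: {"video": f"vid({ctx['image']['image']})"}, "video"),
    ],
}

def run_workflow(workflow_type: str, inputs: dict) -> dict:
    """Execute components sequentially, feeding earlier outputs to later steps."""
    ctx = dict(inputs)
    for component in REGISTRY[workflow_type]:
        ctx[component.output_key] = component.run(ctx)
    return ctx
```

The linear loop is exactly what a DAG executor generalizes: once each component declares which context keys it reads, independent components can be dispatched in parallel.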
Fan-out/fan-in and cancel semantics

When a user submits an idea, Delula launches parallel AI tasks: prompt expansion, content moderation, requirements analysis, tag extraction, and suggested actions. A second fan-out (the structuring phase) produces intent-core extraction, variable identification, UX element generation, and title/description generation with strict ordering dependencies. The live-analysis fan-out supports replace-latest semantics: if the user modifies input while analysis is running, the stale fan-out is cancelled and a new one is launched. This maps to the network's job cancellation design (§6.8), and the strict ordering maps to depends_on edges in recipe step definitions.

The pattern revealed a design challenge: when multiple user actions queue while a fan-out is running, the system must choose between sequential drain (process everything in order) and cancel-and-resubmit (always use the latest state). Different workflow types require different strategies. This nuance informed the network's per-step execution hints (timeout_ms, retry_policy) that allow recipe authors to specify cancellation and supersession behavior.

Quality validation and agentic tool invocation

The "DIRECTOR" system, structured rules plus LLM-powered validation, assesses recipe quality before generation. Rule definitions, LLM validation with structured pass/fail scoring, and a real-time validation API endpoint compose into a quality gate that is not a separate process but a composable step insertable into any workflow. This validates a key network thesis:
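Replace-latest semantics reduce to a generation counter: each new submission supersedes in-flight work, and stale completions are discarded on arrival. A minimal synchronous sketch (the class and method names are illustrative; Delula's real fan-out is asynchronous):

```python
import threading

class LiveAnalysis:
    """Replace-latest semantics: every new submission supersedes any
    in-flight fan-out. Illustrative sketch, not the production design."""

    def __init__(self):
        self._lock = threading.Lock()
        self._generation = 0       # monotonically increasing submission id
        self.latest_result = None

    def submit(self, user_input: str) -> int:
        """Register a new submission; implicitly cancels older fan-outs."""
        with self._lock:
            self._generation += 1
            return self._generation

    def complete(self, generation: int, result: dict) -> bool:
        """Called when a fan-out finishes. Returns False if it was superseded,
        in which case its result is discarded."""
        with self._lock:
            if generation != self._generation:
                return False       # stale: a newer submission replaced it
            self.latest_result = result
            return True
```

Sequential drain, by contrast, would queue each submission instead of bumping the counter; the choice is exactly the per-workflow-type policy decision the text describes.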
quality assurance can be structural rather than aspirational, extending the safety-check pattern (Chapter 9) from compliance to creative quality.

Delula's chat editor extends this further: the agent can emit structured recipe updates and invoke MCP tools that modify UX elements programmatically. The agent receives full recipe context (draft, rules, available components) and produces both human-readable responses and machine-executable tool calls. This is the production proof of multi-protocol ingress (§11.5): an AI agent receiving natural language, reasoning in context, and producing both conversation and action.

Provider abstraction under real heterogeneity

Every Delula generation request routes through the Scrypted API via a vendor SDK that provides recipe invocation, job-status polling, response normalization, and a typed exception hierarchy (auth, validation, payment, rate limit, network, timeout). The integration experience documents every normalization challenge: status casing varies (FAILED vs Completed vs failed); results nest at different depths; images appear at different key paths per provider.

This SDK is the living prototype of the BaseProviderAdapter pattern (Chapter 8, §8.1). Every normalization issue Delula encountered informed the network's adapter contract. The exception hierarchy maps to the fault taxonomy. The idempotency pattern (pass a key on invoke, check on receipt) is the production template for the exactly-once illusion maintained by idempotent side effects (§15.3).

10.1.3 What Delula proves and what it has not

Validated:
- Recipe composition works in production. Users generate content through multi-step workflows daily. The workflow-component pattern handles real provider heterogeneity (FAL, Grok, Sora, Veo, Bedrock) without users knowing which provider executed their request.
- Provider abstraction enables self-healing. When Grok goes down, fallback routing is automatic. When Sora was cancelled, affected workflows were rerouted. The abstraction has been exercised in production, not just theorized.
- Agentic workflows outperform static pipelines. Fan-out analysis, DIRECTOR validation, and chat-agent iteration produce richer context than any single LLM call.
- Pre-computation is a viable fulfillment strategy. The backlog system delivers near-instant results from a pre-generated pool, proving speculative compute can dramatically improve user experience.
- Operational complexity is real and requires dedicated infrastructure. Job recovery, outage detection, cross-server coordination, type-aware timeouts, distributed locking: these consume significant engineering effort and represent the actual value of a managed orchestration platform versus raw API access.

Honest gaps:

- Decentralized execution: Delula runs on a single operator's infrastructure. Multi-operator, multi-region execution with federated coordination (Chapter 15, §15.5) remains unexercised.
- On-chain identity: Providers are configured in YAML, not registered on-chain with ERC-8004 identities and reputation scores (Chapter 11).
- Cryptographic verification: CRPC (Chapter 12) has not been exercised in Delula's production pipeline. Output verification remains trust-based.
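As a concrete instance of the provider heterogeneity catalogued in §10.1.2 (status casing varies; results nest at different key paths), adapter-style normalization can be sketched as follows. The payload shapes, status vocabulary, and key paths are illustrative assumptions, not the real provider formats:

```python
# Sketch of adapter-style response normalization. Payload shapes and
# key paths below are illustrative assumptions, not real provider formats.

TERMINAL_STATUSES = {"completed": "completed", "succeeded": "completed",
                     "failed": "failed", "error": "failed"}

def normalize_status(raw: str) -> str:
    """Collapse provider-specific casing/wording (FAILED, Completed, failed)
    into a canonical vocabulary; anything unrecognized is still in flight."""
    return TERMINAL_STATUSES.get(raw.strip().lower(), "processing")

def extract_result(payload: dict, paths: list[list[str]]):
    """Try each known key path until one resolves; providers nest
    results at different depths."""
    for path in paths:
        node = payload
        for key in path:
            if not isinstance(node, dict) or key not in node:
                break
            node = node[key]
        else:
            return node          # whole path resolved
    return None

# Hypothetical per-provider key paths for a generated image URL.
IMAGE_PATHS = [["data", "image", "url"],
               ["result", "output", "image_url"],
               ["image_url"]]
```

Centralizing this lookup table in one adapter is what lets the rest of the pipeline stay provider-agnostic.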
Source: transcribed from the compiled Scrypted Network Design whitepaper PDF for web reading. Layout, figures, and pagination may differ from the PDF.