Enterprise software
approved by Eidos
The system planned the workflow. No one drew it.
The Eidos system autonomously planned and executed a 10-step approval pipeline for a $48,000/year security platform — investigation, competitive analysis, cost justification, compliance review, presentation, and rollout. The human approved the final decision. Eidos did everything else.
The challenge
A team needed enterprise security software. The approval process at any organization is a gauntlet: cost justification, competitive analysis, standards compliance, executive presentation, procurement, rollout, verification. Each step produces documents that feed the next. Dependencies are implicit. The order matters.
Usually this takes a project manager, two to three weeks, and six meetings. Someone draws the workflow on a whiteboard. Someone else tracks status in a spreadsheet. Half the work is coordination.
This time, no one drew anything.
What Eidos built
Eidos read the organization's contracts — what "approved" means, what "compliant" means, what "justified" means — and planned the entire pipeline autonomously. It identified dependencies, parallelism, and the single human gate.
No one drew this
Eidos was not given this workflow. No human sketched it on a whiteboard, configured it in a workflow tool, or wrote it in a project plan. The system read the same contracts and planned the execution order itself.
It identified that investigation must come first. It recognized that the standards, competitive, and cost analyses can run in parallel — they don't depend on each other, only on the investigation. It knew the SCR (spend case review) draft can't start until all three analyses complete. It placed the human decision gate at exactly the right point: after all evidence is assembled, before any money is spent.
This looks like n8n. It looks like Zapier. It looks like a workflow someone configured. But no one configured it. The graph emerged from the contracts.
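The ordering logic described above — investigation first, three analyses in parallel, the SCR draft after all three, then the human gate — is a topological layering over a dependency graph. A minimal sketch in Python using the standard library's `graphlib` (stage names and the dependency map are illustrative, not Eidos's actual API):

```python
from graphlib import TopologicalSorter

# Hypothetical dependencies inferred from contracts: each stage
# maps to the set of stages whose outputs it consumes.
deps = {
    "investigation": set(),
    "standards": {"investigation"},
    "competitive": {"investigation"},
    "cost": {"investigation"},
    "scr_draft": {"standards", "competitive", "cost"},
    "human_gate": {"scr_draft"},
}

ts = TopologicalSorter(deps)
ts.prepare()
waves = []
while ts.is_active():
    ready = list(ts.get_ready())  # stages whose inputs are all satisfied
    waves.append(sorted(ready))   # everything in one wave can run in parallel
    ts.done(*ready)

# waves → [['investigation'], ['competitive', 'cost', 'standards'],
#          ['scr_draft'], ['human_gate']]
```

Each "wave" is a set of stages with no edges between them, which is exactly where the parallelism in the pipeline comes from.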
What Eidos produced
Ten stages. Nine executed by Eidos. One human decision. At each node, the system produced a concrete deliverable — not summaries, but documents a decision-maker could act on.
- Investigation: Product capabilities mapped to organizational gaps. What the tool does, why the team needs it, what happens without it.
- Standards alignment: Expert scorecard across SOC 2, NIST 800-53, ISO 27001, and CIS Controls. Not "we should do this" — "here's which controls require it."
- Competitive analysis: Four alternatives evaluated on capability, cost, and fit. Clear winner with rationale.
- Cost analysis: Total cost of ownership with context — this tool costs less per month than the organization's coffee budget. Framed for a non-technical decision-maker.
- SCR document: Formal spend case review. Executive summary, business justification, risk assessment, implementation plan. Print-ready.
- Presentation: One-page executive brief. The decision-maker reads this, not the 15-page SCR.
The leverage progression
The comparison isn't AI versus no-AI. It's how each level of AI sophistication compounds the leverage on the same task.
| | Traditional | Claude | Claude Code | Eidos |
|---|---|---|---|---|
| Who plans the work | Project manager draws the workflow | Human plans with AI drafting help | Human directs AI to execute steps | System reads contracts, discovers the graph |
| Who executes | 3 people across 2-3 weeks | 1 person using AI for drafts and research | AI executes, human reviews each step | 9 stages autonomous, 1 human decision |
| Time to approval | 2-3 weeks | 3-5 days | 1-2 days | Hours |
| Human decisions | Dozens | Several | A few | One |
| Leverage | 1x | 3-5x | 10-25x | 50-100x |
| When requirements change | Redraw the workflow, reassign tasks | Rewrite the prompts | Rewrite the prompts | Contracts change, graph adapts automatically |
| What persists | Institutional memory, if you're lucky | Chat history | Code and scripts | Contracts, DAGs, and decision records |
Contracts, not workflows
Workflow tools require a human to design the graph in advance. When the process changes — new compliance requirements, new approval thresholds, new stakeholders — someone reconfigures the tool. The workflow is the bottleneck.
In a contract-driven system, Eidos reads the contracts and discovers the graph. When the contracts change, the graph changes. No one reconfigures anything. The system adapts because the constraints adapted.
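One way to picture graph discovery from contracts: each contract declares what its stage requires and what it provides, and edges emerge by matching requirements to provisions. A hedged sketch (the `Contract` record and artifact names are hypothetical, not Eidos's schema):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Contract:
    stage: str
    requires: frozenset  # artifacts this stage consumes
    provides: frozenset  # artifacts this stage produces

contracts = [
    Contract("investigation", frozenset(), frozenset({"findings"})),
    Contract("cost", frozenset({"findings"}), frozenset({"tco"})),
    Contract("scr_draft", frozenset({"tco"}), frozenset({"scr"})),
]

def discover_edges(contracts):
    """Link each producer of an artifact to every stage that requires it."""
    producers = {}
    for c in contracts:
        for artifact in c.provides:
            producers[artifact] = c.stage
    return {(producers[a], c.stage)
            for c in contracts for a in c.requires if a in producers}

edges = discover_edges(contracts)
# {('investigation', 'cost'), ('cost', 'scr_draft')}
```

Adding or editing a contract changes `requires`/`provides`, and the edge set — the graph — follows automatically; no workflow tool is reconfigured.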
This is the difference between automated workflow and autonomous workflow. Automated means a human designed the machine and it runs. Autonomous means the system reads the goal and figures out the machine.
The contract that started it
The entire 10-stage pipeline grew from a three-line job:
```yaml
source: security-platform-evaluation.md
transcoder: enterprise-approval
context: "New security tool needed. Budget requires director approval. Must align to existing compliance frameworks."
```

Three lines. Eidos read the contract for what "approved" means, discovered the graph, executed nine stages autonomously, and presented one decision to a human. The human said yes.
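A job this small needs almost no machinery to ingest. A minimal sketch of parsing it into a record — assuming a flat `key: value` format, which is an assumption about the file, not documented Eidos behavior:

```python
# Hypothetical parse of the three-line job shown above.
job_text = '''\
source: security-platform-evaluation.md
transcoder: enterprise-approval
context: "New security tool needed. Budget requires director approval. Must align to existing compliance frameworks."
'''

job = {}
for line in job_text.splitlines():
    key, _, value = line.partition(": ")  # split on the first ": " only
    job[key] = value.strip('"')           # drop surrounding quotes, if any

# job["transcoder"] would select which pipeline contracts apply;
# job["context"] would seed the investigation stage.
```

Everything downstream — the graph, the nine stages, the single gate — hangs off these three keys.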