Independent · AI Systems

I don't add AI
to your product.
I design products
around AI.

Most engineering teams treat LLMs as a layer on top of existing architecture. That's the wrong starting point. I design systems where the model is the operating core — with real tool access, structured reasoning, and autonomous action built in from day one.

Let's Talk →

Systems that think. Software that acts.

The approach

The gap isn't capability.
It's architecture.

Foundation models are powerful enough to replace entire workflow layers — but only if your system is designed to let them.

The typical playbook
01 · Build the CRUD app, add an LLM widget at the end
02 · The model has no access to real systems
03 · Users ignore the chatbot, navigate around it
04 · AI is a feature on the roadmap
How I build
01 · Design the agentic interface first
02 · The model has tool-calling access to your data and APIs
03 · Users give intent once — the agent reasons and delivers
04 · AI is the architecture
What I build

What this looks like
in practice

End-to-end from first commit to production — AI-native from the ground up.

Reasoning systems, not chatbots

Products where an LLM has genuine tool access — to your databases, APIs, and external services — and uses it autonomously. Users state intent. The system acts.
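The shape of that loop can be sketched in a few lines. This is a minimal, self-contained sketch: the model client and the `lookupOrder` tool are hypothetical stand-ins (a real system would call an LLM API and real databases or services), but the control flow — model proposes a tool call, system executes it, result goes back into the transcript until the model answers — is the core pattern.

```typescript
// Minimal agentic tool-calling loop. The model and tool below are
// illustrative stubs, not a specific provider's API.

type ToolCall = { name: string; args: Record<string, unknown> };
type ModelTurn = { toolCall?: ToolCall; answer?: string };

// Hypothetical tool registry: the model acts on real systems through these.
const tools: Record<string, (args: Record<string, unknown>) => string> = {
  lookupOrder: (args) => `Order ${args.id}: shipped`,
};

// Stub model: requests a tool first, then answers from the tool result.
// A real implementation would send the transcript to an LLM.
function model(transcript: string[]): ModelTurn {
  if (!transcript.some((m) => m.startsWith("tool:"))) {
    return { toolCall: { name: "lookupOrder", args: { id: 42 } } };
  }
  return { answer: transcript[transcript.length - 1].replace("tool:", "") };
}

// The agent loop: the user states intent once; the system reasons and acts,
// with a step cap so a confused model cannot loop forever.
function runAgent(intent: string, maxSteps = 5): string {
  const transcript = [`user:${intent}`];
  for (let i = 0; i < maxSteps; i++) {
    const turn = model(transcript);
    if (turn.answer !== undefined) return turn.answer;
    if (turn.toolCall) {
      const result = tools[turn.toolCall.name](turn.toolCall.args);
      transcript.push(`tool:${result}`);
    }
  }
  return "step limit reached";
}

console.log(runAgent("Where is my order?"));
```

The step cap is the design point worth noting: autonomy without a bounded loop is how agents burn budget on a stuck tool call.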

Serverless, built for models

Streaming, WebSocket-based interfaces. Async background agents. Multi-model architectures that use the right model for the right task. Infrastructure that doesn't bottleneck your AI.
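Streaming reduces, in code, to forwarding tokens as they arrive rather than waiting for the full reply. A sketch, with an async generator standing in for the model's token stream and `send` standing in for a WebSocket push (both names are illustrative):

```typescript
// Token streaming sketch: the generator stands in for a model's streaming
// response; `send` stands in for pushing chunks over a WebSocket.

async function* streamCompletion(_prompt: string): AsyncGenerator<string> {
  // A real implementation would yield tokens as the model emits them.
  for (const token of ["Planning ", "your ", "trip…"]) {
    yield token;
  }
}

async function handle(prompt: string, send: (chunk: string) => void) {
  // Forward each token the moment it arrives; no buffering of the full reply.
  for await (const token of streamCompletion(prompt)) {
    send(token);
  }
}

const chunks: string[] = [];
void handle("demo", (c) => chunks.push(c)).then(() => {
  console.log(chunks.join(""));
});
```

The same consumer shape works whether the producer is one model or several: a router in front of `streamCompletion` is all a multi-model setup adds.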

Architecture that doesn't age badly

Model selection, RAG design, agentic loop architecture, tool schema design. I work with engineering leads early — the decisions made in week two are the ones you live with in year two.
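Tool schema design is the least glamorous of those decisions and the one models feel most. A sketch of what a well-described tool looks like, in the JSON-Schema style most LLM tool-calling APIs accept — the tool name and fields here are illustrative, not tied to any provider:

```typescript
// Illustrative tool schema: the description tells the model *when* to use
// the tool; each property description tells it *how* to fill the argument.

const searchFlightsTool = {
  name: "search_flights",
  description:
    "Search live flight inventory. Use when the user asks about routes, dates, or prices.",
  inputSchema: {
    type: "object",
    properties: {
      origin: { type: "string", description: "IATA airport code, e.g. SFO" },
      destination: { type: "string", description: "IATA airport code" },
      date: { type: "string", description: "Departure date, YYYY-MM-DD" },
      maxPrice: { type: "number", description: "Optional budget cap in USD" },
    },
    required: ["origin", "destination", "date"],
  },
} as const;
```

A schema the model misreads in week two becomes a migration in year two, which is why this sits with the early architecture work rather than polish.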

Work

Built,
not theorized.

Everything here comes from systems I've designed and shipped — streaming agentic interfaces, async background reasoning pipelines, multi-model architectures with structured output extraction, real-time tool-calling over live data.

I don't consult on AI from the outside. The stack I recommend is the stack I build with.

Budge · Early Access

An AI-native travel research app. Multi-turn agentic conversations with live web search, streaming responses, async preference extraction, and structured plan generation.

Claude Sonnet via Amazon Bedrock
WebSocket streaming + tool calling
Async multi-model pipeline (Nova Lite)
Next.js · AWS Serverless · Terraform
Open Budge ↗

Building something that actually needs to think?

If you're an engineering leader who's done with demo-ware and wants LLMs doing real work inside your product — let's get into it.

Work With Me →