
The AI Application Layer Is Still Wide Open

Why the best AI products aren't the ones with the biggest models—they're the ones that nail the workflow, distribution, and data.

The model layer is consolidating: a handful of labs, a few APIs, and routing layers like OpenRouter. The application layer—what users actually touch—is still wide open. Most of the value will get captured there.

Models are a commodity (in the right sense)

Access to good models is no longer the moat. You can call GPT-4, Claude, or a dozen others through one API. The differentiator isn't "we have a model"; it's "we have a product that does X well." The app defines the workflow, the UX, and the integration with the rest of the stack. That's where the defensibility is.
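Concretely, "one API" means the request shape stays fixed and the model is just a string. A minimal stdlib-only sketch, assuming an OpenAI-compatible gateway (OpenRouter exposes one); the base URL, model IDs, and function names here are illustrative, not a specific vendor's SDK:

```python
import json
import urllib.request

# Illustrative endpoint; any OpenAI-compatible gateway looks the same.
BASE_URL = "https://openrouter.ai/api/v1"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat completion payload.

    Swapping providers is just a different `model` string --
    the rest of the request doesn't change.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def send(payload: dict, api_key: str) -> dict:
    """POST the payload to the gateway and return the parsed response."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Same code path, different models -- the app doesn't care which lab wins.
    for model in ("openai/gpt-4o", "anthropic/claude-3.5-sonnet"):
        payload = build_chat_request(model, "Summarize this ticket.")
        # send(payload, api_key)  # requires a real key and network access
```

The point of the sketch: nothing in the application code is coupled to a particular lab, which is exactly why the model itself can't be the moat.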

Workflow and distribution

The best AI products will be the ones that own a workflow: code review, support triage, content ops, internal tools. They'll use whatever models work best that month. The moat is distribution (users, integrations, data) and workflow design, not the underlying model. Same story as earlier platform waves: the infra gets standardized; the apps that sit on top are where the variety and the value live.

Data and feedback loops

Products that accumulate data—user behavior, corrections, outcomes—can tune prompts, evals, and routing in ways competitors can't copy. The application layer is where that feedback loop lives. Generic chat UIs don't get that; vertical apps do.
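One concrete shape that feedback loop can take is outcome-based routing: log whether users accept or correct each output, then send traffic to whichever model earns the best acceptance rate. A minimal sketch; the class name, epsilon value, and "accepted" signal are illustrative assumptions, not a specific product's design:

```python
import random
from collections import defaultdict

class OutcomeRouter:
    """Route requests to the model with the best observed outcome rate,
    with a small exploration rate so new models still get tried."""

    def __init__(self, models, epsilon=0.1):
        self.models = list(models)
        self.epsilon = epsilon          # fraction of traffic spent exploring
        self.wins = defaultdict(int)    # outputs the user accepted
        self.tries = defaultdict(int)   # total outputs served

    def record(self, model: str, accepted: bool) -> None:
        # "accepted" = the user kept the output instead of correcting it.
        self.tries[model] += 1
        if accepted:
            self.wins[model] += 1

    def rate(self, model: str) -> float:
        return self.wins[model] / self.tries[model] if self.tries[model] else 0.0

    def pick(self) -> str:
        if random.random() < self.epsilon:
            return random.choice(self.models)   # keep exploring
        return max(self.models, key=self.rate)  # exploit the current best
```

A competitor can copy the routing code in an afternoon; they can't copy the accumulated `wins`/`tries` counts, which is the sense in which the data is the moat.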

What's still open

Vertical SaaS with AI baked in. Dev tools (evals, observability, debugging). Internal tools and workflows. Agents that do one job well and plug into existing systems. All of that is application-layer work. The model is an input; the product is the thing you build on top.

If you're waiting for the "AI space" to be figured out, don't. The application layer is still wide open. The teams that pick a workflow, own the data, and ship a great product will win—regardless of which model they call.