The Stack · Issue #001 March 31, 2026 · ~6 min read
Okay so. First issue. No warmup, no "I'm so excited to launch this" preamble. Just: here's what happened in AI this month, here's what I actually think about it, and here's the one thing you should probably do with this information. That's the format. Every Tuesday. Let's go.
The labs are releasing at a pace that's genuinely hard to keep up with, and that's kind of the point. Each one lands with a benchmark table and a press cycle, and by the time you've integrated it, the next one's already in the pipeline. Here's what the benchmarks aren't telling you: at the frontier, these models are getting genuinely hard to differentiate on real-world tasks. The gap is closing. That's not bad news; it's useful information. It means the model choice matters less. What you do with the model matters more.
Everyone's talking about the valuation ($840B post-money, most valuable private company ever). I get it, that number is nuts. But the part that matters operationally: Amazon became OpenAI's exclusive cloud partner for the foreseeable future. That's not a partnership announcement. That's AWS capturing a generation of AI workloads by writing a check instead of winning a technical argument. They've done this before. Usually works.
The Model Context Protocol crossed a milestone this month that didn't get nearly enough attention. 97 million installs means this is no longer "interesting developer experiment" territory — it's becoming foundational infrastructure. If you're building agents and you're not on MCP, you're building something that will either migrate or get replaced. I've been on it since early beta. The developer experience has improved dramatically. This is the thing to pay attention to.
Grok plus orbital infrastructure plus Starlink compute plus X's data moat, all in one company. Set aside your feelings about the CEO for a second and just look at the surface area: no company in history has been this vertically integrated across AI, compute, and distribution. Whether that's a strength or a liability depends entirely on execution, and execution at this scale is an unsolved problem. Worth watching closely. IPO in June if it holds together.
OpenAI discontinued the Sora API this month. Remember when Sora was going to change filmmaking forever? That was about 18 months ago. The actual video AI market in 2026 is Runway, Kling, Pika, and a dozen others who were building product while Sora was winning press awards. Lesson, again: the demo that breaks the internet is not the product that wins the market. Being first with the announcement and first with the working product are different races.
The infrastructure layer is eating the model layer. Here's the mechanism.
Every previous technology wave has followed this pattern: the exciting new thing captures attention, gets funded, gets iterated on, and eventually gets commoditized. Then the boring infrastructure underneath it becomes the durable business.
Databases. Cloud compute. Mobile app stores. Payments. Now AI models.
The companies that win aren't always the ones with the best technology at any given moment. They're the ones that own the layer everyone else has to build on. AWS didn't win because they had the best virtual machines. They won because they built the tooling, the integrations, the ecosystem, and the trust.
I'm building in this space every day and my working thesis is: the next five years are not about which model is 3% better on MMLU. They're about who owns the orchestration layer, the agent tooling, the eval infrastructure, and the deployment surface. That's where I'm putting my attention. That's what I'm building.
Tool: If you haven't looked at MCP tooling in the last 60 days, look now. The ecosystem has matured significantly. Start with the official spec at modelcontextprotocol.io and then look at what's been built on top of it. 97 million installs means there are real production implementations to learn from.
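If you want a feel for what's under the hood before diving into the tooling: MCP is JSON-RPC 2.0 over a transport like stdio or HTTP. A tool-discovery exchange looks roughly like this — field names are paraphrased from my reading of the spec, and `search_docs` is a made-up example tool, so treat modelcontextprotocol.io as the authority on the exact schema.

Request:

```json
{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
```

Response (each tool advertises a name plus a JSON Schema for its input):

```json
{"jsonrpc": "2.0", "id": 1, "result": {"tools": [
  {"name": "search_docs",
   "description": "Search the internal docs index",
   "inputSchema": {"type": "object",
                   "properties": {"query": {"type": "string"}},
                   "required": ["query"]}}]}}
```

The reason this matters for the "foundational infrastructure" claim: any client that speaks this shape can discover and call any server's tools, which is exactly the kind of boring, durable layer the rest of this issue is about.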
Framing: Stop asking "which model should I use" as if it's a permanent decision. Start asking "how do I build something that works with the best available model at any given moment without a migration tax." That's an architecture question, not a vendor question.
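To make the architecture point concrete, here's a minimal sketch of the "no migration tax" idea in Python: code against a tiny interface you own, and confine each vendor SDK to one adapter behind it. The provider name and the `complete()` signature here are hypothetical, not any real SDK's API — this is the shape of the idea, not an implementation.

```python
# Sketch: depend on an interface you own, not on a vendor SDK.
from dataclasses import dataclass
from typing import Protocol


class ChatModel(Protocol):
    """The only surface the rest of your app is allowed to touch."""

    def complete(self, prompt: str) -> str: ...


@dataclass
class EchoModel:
    """Stand-in adapter; a real one would wrap a vendor SDK call."""

    name: str

    def complete(self, prompt: str) -> str:
        # A real adapter would call the vendor here and return its text.
        return f"[{self.name}] {prompt}"


def summarize(model: ChatModel, text: str) -> str:
    # App code sees only ChatModel, so swapping vendors is a config
    # change (instantiate a different adapter), not a rewrite.
    return model.complete(f"Summarize: {text}")


if __name__ == "__main__":
    model = EchoModel(name="best-available-2026-03")
    print(summarize(model, "the infrastructure layer is eating the model layer"))
```

The design choice being illustrated: `summarize` never imports a vendor library, so when next month's model wins the eval, the change is one adapter class, not a codebase migration.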
That's Issue #001. If something in here was wrong, useful, or made you think differently about something — I actually want to know. Hit reply or email rock@f-me.ai.
— Rock
Built on AWS · Powered by Bedrock · Published from somewhere in us-east-1