
Microsoft Build 2023 — The Copilot Stack and Azure AI's Big Bet

·911 words·5 mins
Osmond van Hemert

AI Industry & Regulation - This article is part of a series.

Microsoft Build 2023 just concluded, and after watching the keynotes and digging through the session catalog, one thing is clear: Microsoft is not treating AI as a feature. They’re treating it as the platform. Satya Nadella used the phrase “Copilot stack” repeatedly, describing a layered architecture for building AI-powered applications. Having worked with Azure in various capacities over the years, I see this as the most significant strategic pivot since the original cloud push under Nadella’s leadership.

The Copilot Stack Explained

The Copilot Stack is Microsoft’s framework for how AI applications should be architected. From bottom to top: infrastructure (Azure AI compute), foundation models (OpenAI models via Azure), an AI orchestration layer (centered on Semantic Kernel and prompt management), data grounding (connecting models to your specific data via embeddings and retrieval), and finally the Copilot user experience layer.

What makes this interesting from a DevOps and infrastructure perspective is the orchestration layer. Microsoft is pushing Semantic Kernel — their open-source SDK for integrating LLMs into applications — as the standard way to build AI-augmented workflows. If you’ve been using LangChain, Semantic Kernel is Microsoft’s answer, with tighter Azure integration and a more opinionated architecture.
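The core orchestration idea — wrapping prompt templates as composable functions and chaining them into a pipeline — can be sketched without the SDK itself. This is a minimal illustration of the pattern Semantic Kernel formalizes; the names (`prompt_function`, `run_pipeline`, `echo_llm`) are my own, not the SDK’s API.

```python
# Minimal sketch of the orchestration pattern: prompt templates become
# callable steps, and steps chain by feeding each completion forward.
# All names here are illustrative, not Semantic Kernel's actual API.

def prompt_function(template):
    """Wrap a prompt template as a callable pipeline step."""
    def step(variables):
        return template.format(**variables)
    return step

summarize = prompt_function("Summarize the following text:\n{input}")
translate = prompt_function("Translate to Dutch:\n{input}")

def run_pipeline(steps, text, llm):
    """Render each step's prompt, call the model, pass the result on."""
    for step in steps:
        text = llm(step({"input": text}))
    return text

# A stand-in "model" that just echoes the last line, for demonstration:
echo_llm = lambda prompt: prompt.splitlines()[-1]
result = run_pipeline([summarize, translate], "Build 2023 recap", echo_llm)
```

In a real application the `llm` callable would be a completion request to a deployed model; the SDK’s value is managing exactly this wiring (plus memory and planning) so you don’t hand-roll it.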

The plugin system is the other major piece. Microsoft announced that ChatGPT plugins and Bing plugins will be interoperable with Microsoft 365 Copilot. Build a plugin once, and it can surface across ChatGPT, Bing, and the entire Microsoft 365 suite. For enterprise developers, this cross-platform plugin story is genuinely compelling — you don’t want to build and maintain separate integrations for every AI surface.
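The shared plugin format is the `ai-plugin.json` manifest that ChatGPT plugins already use, which points at an OpenAPI description of your API. Below is a sketch of such a manifest built as a Python dict; all the values are placeholders, and the field names follow the published ChatGPT plugin manifest format.

```python
import json

# Sketch of an ai-plugin.json manifest, the format ChatGPT plugins use
# and that Microsoft says Copilot plugins will share. Every value here
# is a placeholder for illustration.
manifest = {
    "schema_version": "v1",
    "name_for_human": "Ticket Lookup",
    "name_for_model": "ticket_lookup",
    "description_for_human": "Look up support tickets.",
    "description_for_model": "Retrieve support ticket status by ID.",
    "auth": {"type": "none"},
    "api": {"type": "openapi", "url": "https://example.com/openapi.yaml"},
    "logo_url": "https://example.com/logo.png",
    "contact_email": "support@example.com",
    "legal_info_url": "https://example.com/legal",
}
print(json.dumps(manifest, indent=2))
```

The `description_for_model` field is the interesting one: it’s the text the model reads to decide when to invoke your plugin, so it’s effectively a prompt you ship with your API.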

Azure AI Infrastructure Updates

For teams running AI workloads in production, Build brought several meaningful infrastructure announcements. Azure AI Studio is a new unified portal for building generative AI applications, combining model deployment, prompt engineering, and evaluation tools in one interface.

The most practically useful announcement for my work is the Azure OpenAI Service updates. GPT-4 is now generally available on Azure, with enterprise features that the direct OpenAI API doesn’t offer: virtual network support, managed identity authentication, content filtering, and data residency guarantees. If you’ve been hesitant about using OpenAI’s API for production enterprise workloads due to compliance concerns, Azure’s wrapper addresses most of those issues.
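The API surface mirrors OpenAI’s, but you call your own Azure resource endpoint and authenticate against it. Here’s a stdlib-only sketch of the request shape; the resource name, deployment name, and API version are placeholders, and the URL scheme and `api-key` header follow Azure OpenAI’s documented REST pattern.

```python
import json
from urllib import request

# Sketch of an Azure OpenAI chat completions request. Resource and
# deployment names are placeholders for your own Azure resources.
resource = "my-resource"   # your Azure OpenAI resource name
deployment = "gpt-4"       # your model deployment name
url = (
    f"https://{resource}.openai.azure.com/openai/deployments/"
    f"{deployment}/chat/completions?api-version=2023-05-15"
)
payload = {
    "messages": [{"role": "user", "content": "Summarize Build 2023."}],
    "temperature": 0.2,
}
req = request.Request(
    url,
    data=json.dumps(payload).encode(),
    headers={"api-key": "<your-key>", "Content-Type": "application/json"},
)
# request.urlopen(req) would send it. Unlike api.openai.com, this
# endpoint can sit behind a virtual network and use managed identity
# (an Azure AD token) instead of a static key.
```

That last point is the enterprise story in a nutshell: same request shape, but the endpoint lives inside your compliance boundary.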

There’s also Prompt Flow, a new tool for building, evaluating, and deploying prompt-based applications. It integrates with Azure DevOps and GitHub Actions for CI/CD of AI applications. The idea of having automated testing for prompt quality in your deployment pipeline is something I’ve been implementing manually — having platform support for this is welcome.
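The manual version of this is simpler than it sounds: run a prompt through the model, assert properties of the output, and fail the pipeline if the assertions fail. This sketch shows the pattern; `fake_model` is a stand-in for a real completion call, and the evaluation logic is mine, not Prompt Flow’s API.

```python
# Sketch of an automated prompt-quality check a CI pipeline can run.
# Prompt Flow formalizes this; the pattern itself is just assertions
# over model outputs. `fake_model` stands in for a real model call.

def fake_model(prompt: str) -> str:
    # Placeholder: a real pipeline would call the deployed model here.
    return "Azure OpenAI Service offers VNet support and managed identity."

def evaluate(prompt: str, must_contain: list[str]) -> dict:
    """Score one prompt: does the output mention every required term?"""
    output = fake_model(prompt)
    hits = [t for t in must_contain if t.lower() in output.lower()]
    return {"output": output, "passed": len(hits) == len(must_contain)}

result = evaluate(
    "List Azure OpenAI enterprise features.",
    must_contain=["VNet", "managed identity"],
)
```

Wire a check like this into GitHub Actions and a prompt regression blocks the merge the same way a failing unit test does.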

Dev Tools: GitHub Copilot Chat and Dev Home

GitHub Copilot Chat is moving out of private preview into public preview. Unlike the original Copilot inline completion, Chat provides a conversational interface within VS Code and Visual Studio for asking questions about your codebase, generating tests, explaining code, and suggesting fixes for errors.

I’ve been using the original Copilot for over a year now, and the chat interface adds a genuinely different dimension. Inline completions are great for the “I know what I want to write” flow. Chat is better for the “I’m not sure how to approach this” flow. Having both available in the same editor is a productivity multiplier.

Microsoft also announced Dev Home, a new Windows application for developer machine setup. It connects to GitHub, manages development environments, and provides a dashboard for monitoring projects. It’s open source and extensible. After spending too many hours of my career setting up development environments on new machines, anything that streamlines that process gets my attention.

Fabric and the Data Platform Story

Microsoft Fabric is a unified analytics platform that merges Power BI, Azure Synapse, and Azure Data Factory into a single SaaS product. While this might seem tangential to the AI story, it’s actually central — the “data grounding” layer of the Copilot Stack depends on having your organizational data accessible, indexed, and embeddings-ready.
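Concretely, “data grounding” means embedding your documents, retrieving the closest match to a user’s query, and stuffing it into the prompt. A toy sketch of that retrieval step, with 3-dimensional stand-ins for real embedding vectors (which have hundreds or thousands of dimensions):

```python
import math

# Toy sketch of retrieval for data grounding: find the document whose
# embedding is closest (by cosine similarity) to the query embedding,
# then include it in the prompt as context. Vectors are illustrative.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

docs = {
    "vacation policy": [0.9, 0.1, 0.0],
    "expense process": [0.1, 0.8, 0.2],
}
query = [0.85, 0.15, 0.05]  # embedding of "How much PTO do I get?"

best = max(docs, key=lambda name: cosine(query, docs[name]))
prompt = (
    f"Answer using this context:\n{best}\n\n"
    "Question: How much PTO do I get?"
)
```

The platform bet is that OneLake makes the `docs` side of this — one indexed, organization-wide store — a solved problem rather than a per-team integration project.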

Fabric introduces a lakehouse architecture with OneLake, a single unified data lake for the entire organization. Every Fabric workload — data engineering, data warehousing, real-time analytics, data science — works against the same underlying storage. The AI integration comes through Copilot in Fabric, which can generate data pipelines, write SQL queries, and create Power BI reports from natural language descriptions.

My Take

What impresses me about Microsoft’s Build announcements isn’t any single product — it’s the coherence of the overall platform story. Google I/O earlier this month felt like “we added AI to everything.” Build feels like “we designed a platform for building AI applications, and here’s how every piece fits together.”

The Copilot Stack architecture is the most clear-headed framework I’ve seen for thinking about AI application development. The separation between foundation models, orchestration, data grounding, and UX mirrors how well-architected traditional applications separate concerns — and it gives teams clear layers to work on independently.

From a practical standpoint, if you’re an Azure shop, the path to building AI-powered applications just got significantly smoother. The Azure OpenAI Service with enterprise controls, Semantic Kernel for orchestration, Prompt Flow for testing, and Azure DevOps integration for CI/CD creates a complete pipeline. If you’re not an Azure shop, this level of platform integration makes it worth evaluating whether you should be.

The competitive dynamics are fascinating. Microsoft has the enterprise distribution, the OpenAI partnership for models, and the developer tools with GitHub and VS Code. That’s a formidable combination that neither Google nor AWS can fully match right now. The next year is going to be intense.
