Cortex Code Expands: One Governed Agent for Your Entire Data Stack, Everywhere You Work

Get hands-on

The best way to see what Cortex Code can do is to run it in your own Snowflake environment. Start with the official documentation and the getting started quickstart. Non-Snowflake users can try the Cortex Code CLI trial to build across their data stack.

Forward-looking statements

This article contains forward-looking statements, including statements about our future product offerings, which are not commitments to deliver any product offerings. Actual results and offerings may differ and are subject to known and unknown risks and uncertainties. See our latest 10-Q for more information.

Cortex Code in Snowsight is now powered by Cloud Agents

Snowsight users can now experience the full capabilities of Cortex Code through our Cloud Agents (in private preview).

Cloud Agents provide the same agent loop, skills, tool execution and runtime as the Cortex Code CLI, inside a secure and tightly managed cloud environment. For Snowsight users, this means Cortex Code can now run a larger set of tools to accomplish its tasks, including browsing the web, accessing a persistent file system and executing arbitrary code. Zero installation is required. All compute is dedicated and isolated, and it runs inside Snowflake, ensuring that the same governance framework that governs your data also governs your agent.

The sandbox particularly unlocks long-running data workflows such as pipeline builds that continue in the background, complex debugging sessions that persist across days and scheduled agent tasks that run without a human at the keyboard.

More transparent and interactive experience in Snowsight

We’re also introducing two major user experience improvements to Cortex Code in Snowsight: Plan Mode and Snap & Ask.

Plan Mode brings structured, editable planning to Cortex Code in Snowsight. Before executing, Cortex Code researches your actual data environment — schemas, pipelines, catalog — and generates a plan for you to review and edit. Users see exactly what the agent will do and why, and they can approve, redirect or refine before a single line runs. For teams adopting agentic workflows at scale, Plan Mode is the on-ramp that builds trust — the same way diffs and code review build trust before deployment.

Snap & Ask lets users interact directly with what they see on-screen across Snowsight. It is currently available on the Anomaly screen and coming soon to DCR, Query History, Performance Explorer, Streamlit and Notebooks. 

You can select any chart, DAG, table or card container and ask questions about it without needing to describe what you see. Snap the view, ask a question, and get an answer grounded in your actual, live data context.

New agent skills make Cortex Code the agent for data engineers

Data engineering is a core strength of Cortex Code, and skills push this further with expert workflows for dbt, data quality, lineage, cost intelligence and more. Run /skill list to explore the full list. This month we are announcing the following:

  • snowpark-python covers the full Python pipeline lifecycle: authoring Snowpark code, deploying pipelines to Snowflake, setting up CI/CD and enabling observability. Engineers get expert guidance on the patterns that matter, from UDF design to deployment best practices, without leaving their workflow.

  • snowpark-connect focuses on migrating existing Apache Spark™ workloads to Snowflake. For teams running Spark workloads today and evaluating Snowflake as a platform, the Snowpark Connect skill provides a structured migration path: analyzing existing Spark code, identifying compatibility patterns and applying targeted fixes to help teams run their Spark code directly on Snowflake with minimal changes.

  • dbt-projects-on-snowflake covers deploying and managing dbt projects as native Snowflake objects — from initial deployment and versioning to execution, documentation generation and task-based scheduling. Data engineers get expert guidance working with specific dbt syntax and knowledge of both Snowflake and dbt best practices.

  • dcm covers the full lifecycle of DCM Projects in Snowflake: authoring manifest files, managing schema and table definitions, and declaratively managing database objects. Teams get structured guidance on treating database schema as versioned, declarative artifacts — from initial project setup through ongoing change management.

An agent platform that meets you where you work

Cortex Code is now an agent platform.

With the Cortex Code Agent SDK, teams can program Cortex Code in Python and TypeScript and embed its governed, context-aware capabilities into their own tools, apps and workflows in headless mode.
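To make the headless pattern concrete, here is a rough, purely illustrative sketch of embedding an agent in a Python workflow. Every name in it (the cortex_code module, the Agent class, run and summary) is a hypothetical placeholder for illustration, not the actual SDK surface; consult the Agent SDK documentation for the real API.

```python
# Hypothetical sketch only -- module, class and method names below are
# placeholders, not the real Cortex Code Agent SDK surface.
from cortex_code import Agent  # hypothetical module name

# Hypothetical: construct a governed agent bound to a Snowflake connection
agent = Agent(connection="my_snowflake_connection")

# Hypothetical: run a task headlessly and inspect the structured result
result = agent.run(
    "Profile the ORDERS table and flag columns with more than 5% null values"
)
print(result.summary)
```

Because the agent runs inside your governance boundary, a sketch like this would inherit the same access policies as the connection it is built on.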

Cortex Code is now available as a native VS Code extension (private preview), Model Context Protocol (MCP) server and Claude Code plugin (preview), bringing data-native AI coding into VS Code, Cursor, Claude Code and other coding agents. Cortex Code also supports the Agent Client Protocol (ACP), the open standard for connecting AI coding agents to editors. That means developers can use editors such as Zed, JetBrains, Emacs and more than 30 other ACP-compatible editors to work with Snowflake data directly. Developers get the full power of Cortex Code exactly where code is written, reviewed and refined, without breaking flow or switching context.
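For editors that consume MCP servers, registration typically follows the common mcpServers JSON shape shown below. This is a sketch, not official configuration: the command and args values are placeholders, since the actual launch command for the Cortex Code MCP server is defined in the official documentation. (JSONC-style comments work in VS Code's settings files; strip them for strict-JSON configs.)

```jsonc
{
  "mcpServers": {
    "cortex-code": {
      // Placeholder command: substitute the documented launch command
      // for the Cortex Code MCP server.
      "command": "cortex",
      "args": ["mcp", "serve"]
    }
  }
}
```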

Developers are moving faster with data than ever before, building across a modern data stack that spans warehouses, transformation tools and orchestration layers. But while their workflows are deeply data aware, many AI coding assistants are still built for general-purpose software development. They can generate code, but they lack the context that matters in data work: your schemas, catalog, lineage and access policies. The result is code that may look promising at first glance but still needs substantial human review and correction before it’s production ready.

After launching Cortex Code earlier this year as a Snowflake-aware AI coding agent, we quickly expanded it into an agent that understands not just your data in Snowflake but your entire data stack. Support for dbt and Apache Airflow marked the first step in that evolution. In just two months, more than 50% of our customers have used Cortex Code. Every data practitioner, from data engineers and analysts to ML engineers, is building faster thanks to the data-native agent harness and environment awareness from the first prompt.

Today, we are taking the next step. Cortex Code now supports additional data systems, works natively in the tools developers already use and is available as a programmable platform that teams can embed in their own workflows. Additionally, new capabilities add greater transparency, visual intelligence and full agentic execution, bringing the power of Cortex Code to every user within the Snowsight interface.

One governed agent. Your entire data stack. Every environment your team already works in.

Now you can build on any data platform with Cortex Code

When working across multiple systems, data engineers spend enormous amounts of time reestablishing context every time they move between systems, and untold hours debugging logs across those systems to pinpoint failures or prepare optimizations. The challenge is compounded by the fact that teams use different tools for different systems, making it even harder to maintain context and build efficiently.

Whether you are building an Apache Spark™ pipeline, managing dbt models, querying an operational database or setting up an Apache Iceberg™ catalog integration, Cortex Code makes it simpler, more seamless and faster with support for AWS Glue, Databricks and Postgres.

For data teams working on the full stack, this represents a fundamental shift: one agent that knows your entire data footprint and context, wherever your data and coding logic lives. 
