Securely Deploy Custom Apps and Models with Snowpark Container Services, Now Generally Available

Since introducing Snowpark Container Services, we’ve seen overwhelming adoption across industries from customers and partners, including Landing.AI, Relational.AI, H2O.ai, SailPoint, AIR MILES, Spark NZ, and Eutelsat OneWeb. These organizations and many more are using Snowpark Container Services to deploy everything from custom front-ends and large-scale ML training and inference to open source and homegrown models, all easily and securely within Snowflake.

Today we are excited to announce the general availability (GA) of Snowpark Container Services in all AWS commercial regions, as well as public preview in all Azure commercial regions. Customers can get fast access to GPU infrastructure without needing to self-procure instances or make reservations with their public cloud provider. (GPU availability may be limited in certain regions.)

And with such widespread adoption of the feature, we’re excited to announce that we’ve lowered costs by 50% across all instance types!   

Interested in learning more about why we created Snowpark Container Services? Check out this blog.

Deploy custom workloads securely, without complexity

Security, simplicity and value: This is why customers are so excited about Snowpark Container Services. 

First, security. Snowpark Container Services gives developers the ability to bring any containerized workload to the data that is already secured in Snowflake: ReactJS front-ends, open source large language models (LLMs), distributed data processing pipelines, you name it. Data doesn’t need to move across a patchwork of services, which opens up security and governance risks; it can stay within Snowflake while you analyze it, transform it and build with it.
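
To make this concrete, here is a minimal sketch of what deploying a containerized workload can look like. All object names (the compute pool, image repository path and service) are hypothetical placeholders, and the image is assumed to already be pushed to an image repository in your account:

    -- Create a small CPU compute pool to host the service (names are placeholders)
    CREATE COMPUTE POOL my_compute_pool
      MIN_NODES = 1
      MAX_NODES = 1
      INSTANCE_FAMILY = CPU_X64_XS;

    -- Run the container as a long-running service with a public endpoint
    CREATE SERVICE my_service
      IN COMPUTE POOL my_compute_pool
      FROM SPECIFICATION $$
        spec:
          containers:
          - name: app
            image: /my_db/my_schema/my_repo/my_app_image:latest
          endpoints:
          - name: ui
            port: 8080
            public: true
      $$;

Because the service runs inside your Snowflake account, it can work against governed data without that data ever leaving Snowflake.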

Second, simplicity. Stitching together various container registries, management services, compute services and observability tools is complicated. It creates maintenance overhead for developers and adds complexity to architectures. Snowpark Container Services makes it simple. It is a fully managed service that provides a single, integrated experience. 

Third, value. The simplicity of a fully managed service reduces overhead and operational burden, maximizing the value you get out of the service. Additionally, we’ve added budget controls that enable you to monitor and manage resources cost-effectively. And as mentioned above, we’ve reduced costs for all instance types: see Table 1(b) in the Snowflake rate card.
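
As a rough illustration of those budget controls, the sketch below creates a custom budget and caps its monthly spend; the budget name and credit amount are hypothetical, and monitored resources (such as a compute pool) are attached separately through the budget’s resource-management methods:

    -- Create a custom budget and set a monthly spending limit in credits (placeholder values)
    CREATE SNOWFLAKE.CORE.BUDGET my_spcs_budget();
    CALL my_spcs_budget!SET_SPENDING_LIMIT(500);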

What’s new in GA? 

Beyond general availability itself, close collaboration with our design partners throughout preview has helped us further advance Snowpark Container Services across the following key areas.

  • Improved security and governance: We enhanced control over security aspects, including egress, ingress and networking. Register here for our on-demand security deep dive to learn more.
  • Increased storage options: We added more diverse storage solutions, including local volumes, memory, Snowflake stages and configurable block storage, to support additional use cases, such as deploying high-performance LLMs and low-latency applications.
  • More diverse instance types: We introduced high-memory instances and dynamic GPU allocation for intensive workloads (a GPU example appears in the sketch after this list).
  • More flexible, GPU-powered compute in Snowflake Notebooks: Container Runtime (currently in private preview) provides seamless access to distributed processing with CPU and GPU options, which is ideal for resource-intensive machine learning tasks, such as deep learning. Users can get started with the Container Runtime directly from Snowflake Notebooks (currently in public preview) with optimized data loading from Snowflake, automatic lineage capture and Model Registry integration.
  • Observability with Snowflake Trail: With Snowflake Trail, you can get a comprehensive set of telemetry signals, including metrics, logs and traces, all within Snowflake. Built with OpenTelemetry standards, schema and open ecosystem integrations in mind, Snowflake telemetry and notification capabilities integrate with some of the most popular developer tools, including Datadog, Grafana, Metaplane, Monte Carlo, PagerDuty and Slack. 
  • Streamlined DevOps: With GA, Snowpark Container Services supports programmatic ingress, spec templating and integration of jobs with services, helping automate software development and IT operations.
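
To give a feel for the GPU-backed compute mentioned above, here is a minimal sketch that creates a GPU compute pool and runs a containerized training job on it to completion. The instance family shown is just one of the available GPU options, and all object names and the training image are hypothetical:

    -- Create a GPU-backed compute pool for intensive workloads (names are placeholders)
    CREATE COMPUTE POOL my_gpu_pool
      MIN_NODES = 1
      MAX_NODES = 1
      INSTANCE_FAMILY = GPU_NV_S;

    -- Execute a containerized training job on the pool; it runs to completion and then stops
    EXECUTE JOB SERVICE
      IN COMPUTE POOL my_gpu_pool
      NAME = my_training_job
      FROM SPECIFICATION $$
        spec:
          containers:
          - name: train
            image: /my_db/my_schema/my_repo/my_training_image:latest
      $$;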

Get started with Snowpark Container Services

Here are a few resources to help you get started:

  • Deploy your first container in Snowflake with this quickstart. (Note: Snowpark Container Services is not available for free trial accounts.)
  • Get notified when new regions become available through GitHub.
  • Learn more about Snowpark Container Services in our documentation.
  • Watch the tech talk to learn how Landing.AI is effortlessly and securely deploying large vision models in Snowflake. 
  • Check out this YouTube playlist full of demos of developers using Snowpark Container Services for everything from call center analytics and drug discovery to running Doom within Snowflake!

We look forward to seeing all the cool things you build in the AI Data Cloud with Snowpark Container Services.
