The potential of generative AI and large language models (LLMs) for enterprises is massive.
We’ve talked about this opportunity before and at Summit 2023, we announced a number of capabilities that come together to help our customers bring generative AI and LLMs directly to their proprietary data, all delivered through a single, secure platform.
Snowflake’s single platform has already helped our customers break down data silos, and it enables them to bring more types of development directly to their data. This includes the ability to run and fine-tune leading LLMs in Snowflake using Snowpark Container Services, get smarter about your data with built-in LLMs, and boost productivity with LLM-powered experiences. Read on to learn more about these announcements.
At Summit, we announced Snowpark Container Services (private preview), which enables developers to effortlessly register and deploy containerized data apps using secure Snowflake-managed infrastructure with configurable hardware options such as accelerated computing with NVIDIA GPUs. This additional flexibility drastically expands the scope of AI/ML and app workloads that can be brought directly to Snowflake data.
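To make this concrete, here is a rough sketch of what provisioning a GPU-backed compute pool and registering a containerized service might look like from Snowpark Python. Snowpark Container Services is in private preview, so the exact DDL may differ; the pool, service, and spec file names below are hypothetical.

```python
# Illustrative only: Snowpark Container Services is in private preview,
# so the exact DDL may differ. The pool, service, and spec names are
# hypothetical.
from snowflake.snowpark import Session

session = Session.builder.configs({
    "account": "<account>",
    "user": "<user>",
    "password": "<password>",
    "role": "<role>",
}).create()

# Provision a GPU-backed compute pool for the service.
session.sql("""
    CREATE COMPUTE POOL IF NOT EXISTS llm_pool
      MIN_NODES = 1
      MAX_NODES = 1
      INSTANCE_FAMILY = GPU_NV_S
""").collect()

# Register a containerized service from a YAML spec staged in Snowflake;
# the image it references lives in a Snowflake image repository.
session.sql("""
    CREATE SERVICE IF NOT EXISTS llm_service
      IN COMPUTE POOL llm_pool
      FROM @specs SPECIFICATION_FILE = 'llm_service.yaml'
""").collect()
```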
To make it even easier and more secure for customers to take advantage of leading LLMs, Snowpark Container Services can be used as part of a Snowflake Native App, so customers will be able to get direct access to leading LLMs via the Snowflake Marketplace and install them to run entirely in their Snowflake accounts. Initial commercial LLM providers include AI21 Labs, Reka, and NVIDIA (via the NeMo framework, part of the NVIDIA AI Enterprise software platform).
For these providers, LLM weights and other proprietary IP are not exposed to the app consumer, because the logic and data in a Snowflake Native App are not accessible to the end consumer, even when the app is deployed and running in Snowpark Container Services in the end consumer’s Snowflake account. And since the LLM runs inside the end consumer’s account, governed enterprise data used for fine-tuning or other interactions with the LLM is never exposed back to the provider. A win-win for both parties. Check out this demo to see this in action.
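Once such an app is installed from the Marketplace, calling the packaged LLM could be as simple as invoking a function the app exposes. The sketch below is hypothetical; each provider defines its own app interface, so the app and function names are assumptions.

```python
# Hypothetical sketch: the installed Native App exposes a SQL function
# (llm_app.core.complete here); actual names depend on the provider.
from snowflake.snowpark import Session

# Reuse an existing connection or the default connection configuration.
session = Session.builder.getOrCreate()

response = session.sql(
    "SELECT llm_app.core.complete('Summarize our open supplier contracts')"
).collect()
print(response[0][0])
```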
Streamlit’s vision has always been to be an easy and delightful way to bring data and AI/ML models to life as interactive applications built with Python. With LLMs, it’s no different: Streamlit has rapidly become the de facto way to build UIs for LLM-powered apps. In fact, over 7,000 LLM-powered Streamlit apps have already been created on the Community Cloud, with that number growing every day. Over 190,000 snippets of Streamlit code (and counting) exist on GitHub alone, all of which help developers interact with GPT-4 and other LLMs. This means that analysts, data scientists, and even students can quickly perform analyses, prototype new apps, and weave auto-generated Streamlit fragments throughout other apps.
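Here is a minimal sketch of such an app, using Streamlit’s chat elements with a placeholder model call; swap call_llm for your actual LLM client (OpenAI, a Snowpark Container Services endpoint, and so on).

```python
# A minimal Streamlit chat UI for an LLM-powered app. The call_llm
# function is a placeholder; replace it with a real model call.
import streamlit as st

def call_llm(prompt: str) -> str:
    # Placeholder: swap in your model client here.
    return f"(model response to: {prompt})"

st.title("LLM chat demo")

if "history" not in st.session_state:
    st.session_state.history = []

# Replay the conversation so far.
for role, text in st.session_state.history:
    with st.chat_message(role):
        st.write(text)

# Accept a new prompt, call the model, and render both turns.
if prompt := st.chat_input("Ask a question about your data"):
    st.session_state.history.append(("user", prompt))
    with st.chat_message("user"):
        st.write(prompt)
    answer = call_llm(prompt)
    st.session_state.history.append(("assistant", answer))
    with st.chat_message("assistant"):
        st.write(answer)
```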
With the integration of Streamlit in Snowflake (public preview soon), Snowflake customers can use Streamlit to develop and deploy powerful UIs for their LLM-powered apps and experiences entirely in Snowflake. This empowers teams to effortlessly share apps using existing Snowflake governance and to quickly deploy those apps without operational burden, because it all runs on the Snowflake platform.
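For example, a Streamlit in Snowflake app can pick up the viewer’s active Snowpark session and query governed data directly, with no credentials in code. A short sketch (the table name is hypothetical):

```python
# Inside Streamlit in Snowflake, the app runs with the viewer's active
# Snowpark session, so no credentials are needed in code.
import streamlit as st
from snowflake.snowpark.context import get_active_session

session = get_active_session()

st.title("Governed data, interactive UI")

# Query a governed table directly; access is enforced by the viewer's
# Snowflake role. The table name is hypothetical.
df = session.table("sales.public.orders").limit(100).to_pandas()
st.dataframe(df)
```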
Additionally, Snowflake is building LLMs directly into the platform to help customers boost productivity and unlock new insights from their data. One area that’s been especially challenging for customers is understanding and extracting value from unstructured data, such as documents. With Document AI (private preview), Snowflake is providing a leading LLM to help customers quickly and easily extract information from documents. See it in action here.
Document AI leverages a purpose-built, multimodal LLM that is natively integrated within the Snowflake platform. Customers can easily and securely extract content, such as invoice amounts or contractual terms, from documents, and fine-tune results using a visual interface and natural language. Since it’s all part of Snowflake’s single platform, data engineers and developers can also perform inference by programmatically calling the built-in or fine-tuned models, such as in pipelines with Streams and Tasks or in applications.
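As a hedged sketch of what programmatic inference could look like from a pipeline or application: the model name, stage, and PREDICT calling convention below are assumptions, and the private-preview API may differ.

```python
# Hypothetical sketch: the model name (invoice_model), stage (@invoices),
# and PREDICT calling convention are assumptions; the private-preview
# Document AI API may differ.
from snowflake.snowpark import Session

session = Session.builder.getOrCreate()

results = session.sql("""
    SELECT invoice_model!PREDICT(
             GET_PRESIGNED_URL(@invoices, 'inv_2023_001.pdf')
           ) AS extracted
""").collect()
print(results[0]["EXTRACTED"])  # e.g., JSON with invoice amount and terms
```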
As we’ve seen across the industry, LLMs can also be a powerful way to increase user productivity, from reducing the need for manual coding to aiding discoverability. At Summit, we showcased a number of enhancements currently in development to bring LLM-powered experiences to Snowflake customers. This includes conversational search experiences to help customers discover data and apps based on business questions in the Snowflake Marketplace, as well as conversational text-to-code capabilities to make it easier for users to query data and discover new insights in Snowsight. We will continue to expand in this area to make it even easier for anyone, even non-coders, to discover and get value from their data.
Generative AI is causing a fundamental shift across software and enterprises but the journey is just getting started. Stay connected to our developer site to learn more about how you can use Snowflake to bring the power of LLMs to your enterprise.
Learn more: Read this blog to learn how Snowflake is building a data-centric platform for generative AI and LLMs.