Latest posts

Content filtering breakthrough: Snorkel client reaches 96% recall in 3 days

Snorkel AI helped a client solve the challenge of social media content filtering quickly and sustainably. Here’s how.

Gabe Smith
March 26, 2024

Here’s how Snorkel Flow + Google AI built an enterprise-ready model in a day

Google and Snorkel AI customized PaLM 2 using domain expertise and data development to improve performance by 38 F1 points in a matter of hours.

March 19, 2024

Snorkel teams with Microsoft to showcase new AI research at NVIDIA GTC

Microsoft infrastructure facilitates Snorkel AI research experiments, including our recent high rank on the AlpacaEval 2.0 LLM leaderboard.

How Skill-it! enables faster, better LLM training

Humans learn tasks better when taught in a logical order. So do LLMs. Researchers developed a way to exploit this tendency called “Skill-it!”

March 12, 2024

Fine-tuned representation models boost LLM systems. Here’s how

Fine-tuned representation models are often the most effective way to boost the performance of AI applications. Learn why.

Trung Nguyen
March 5, 2024

Enterprise GenAI to surge in 2024: survey results

Enterprise GenAI applications will likely surge toward production in 2024, according to Snorkel AI Enterprise LLM Summit survey results.

February 29, 2024

Large language model training: how three training phases shape LLMs

Training large language models is a multi-layered stack of processes, each with its unique role and contribution to the model’s performance.

Stephen Bach
February 27, 2024

LoRA: Low-Rank Adaptation for LLMs

Low-rank adaptation (LoRA) lets data scientists customize GenAI models like LLMs faster than traditional full fine-tuning methods.

February 21, 2024

LLM distillation demystified: a complete guide

LLM distillation isolates task-specific LLM performance and reproduces it in a smaller model, making it faster and cheaper to run.

February 13, 2024

Enterprises must shift their focus from models to data in AI development

Snorkel AI CEO Alex Ratner explains his view on the importance of data development in AI and illustrates his position with two case studies.

February 9, 2024

Insurance’s GenAI revolution: a business perspective

Snorkel CEO Alex Ratner talks with QBE Ventures’ Alex Taylor about the future of AI, LLMs and multimodal models in the insurance industry.

February 6, 2024

Scaling human preferences in AI: Snorkel’s programmatic approach

We’ve developed new approaches to scale human preferences and align LLM output to enterprise users’ expectations by magnifying SME impact.

Hoang Tran
January 31, 2024

Building better enterprise AI: incorporating expert feedback in system development

Enterprises that aim to build valuable GenAI applications must view them at the systems level. LLMs are just one part of a larger ecosystem.

January 30, 2024

“Fall in love with your data”—Snorkel AI’s Enterprise LLM Summit

Snorkel AI’s Jan. 25 Enterprise LLM Summit focused on one theme: AI data development drives enterprise AI success.

January 26, 2024

Why QBE Ventures invested in Snorkel AI

QBE Ventures made a strategic investment in Snorkel AI because it provides what insurers need: scalable and affordable ways to customize AI.

New benchmark results demonstrate value of Snorkel AI approach to LLM alignment

Snorkel researchers’ state-of-the-art methods created a 7B LLM that ranked 2nd, behind only GPT-4 Turbo, on the AlpacaEval 2.0 leaderboard.

January 24, 2024

Retrieval augmented generation (RAG): a conversation with its creator

Snorkel CEO Alex Ratner spoke with Douwe Kiela, an author of the original paper on retrieval augmented generation (RAG).

January 16, 2024

Snorkel Flow 2023.R4: enhanced UI + PDF and Databricks tools

New unified prompting UI + RAG features, PDF annotation, Databricks MLflow integration, Snorkel Flow Studio, and 2x faster dataset loading!

Nick Harvey
January 9, 2024

How Snorkel Flow users can register custom models to Databricks

The Databricks Model Registry integration equips Snorkel Flow users to automatically register custom, use case-specific models.

January 9, 2024

Stanford professor discusses exciting advances in foundation model evaluation

Snorkel CEO Alex Ratner chatted with Stanford Professor Percy Liang about evaluation in machine learning and in AI generally.

January 2, 2024

BERT models: Google’s NLP for the enterprise

LLMs have claimed the spotlight since the debut of ChatGPT, but BERT models quietly handle most enterprise production NLP tasks.

December 27, 2023

First cohort of Snorkel GenAI customers sees gains up to 54 points

In its first six months, Snorkel Foundry collaborated on high-value projects with notable companies and produced impressive results.

December 20, 2023

How to tackle advanced classification challenges using Snorkel Flow

When done right, advanced classification applications cultivate business value and automation, unlock new business lines, and reduce costs.

December 14, 2023

How to scale chatbot development with Google Dialogflow and Snorkel Flow

A brief guide on how financial institutions could use Google Dialogflow with Snorkel Flow to build better chatbots for retail banking.

December 12, 2023

How predictive AI + generative AI build amazing document understanding

A proof-of-concept project that combines predictive AI + generative AI to minimize LLMs’ risks while keeping their advantages.

December 5, 2023

How to fine-tune Llama 2 in Snorkel Flow

Data scientists can fine-tune Llama 2 to adapt it to specific tasks. The Snorkel Flow data development platform makes it easy to do so.

Hoang Tran
November 28, 2023

Enterprise LLM challenges and how to overcome them

Large language models open many new opportunities for data science teams, but enterprise LLM challenges persist—and customization is key.

Hoang Tran
November 16, 2023

LLM distillation techniques to explode in importance in 2024

LLM distillation will become more important in 2024, according to a poll of attendees at Snorkel AI’s 2023 Enterprise LLM virtual summit.

November 9, 2023

How to fine-tune large language models for enterprise use cases

LLMs have broad but shallow knowledge and fall short on specialized tasks. For best performance, enterprises must fine-tune their LLMs.

Hoang Tran
November 2, 2023

Snorkel Flow 2023.R3 release: PaLM integration, streamlined onboarding, and enhanced user experience

The Snorkel Flow 2023.R3 release is packed with updates that improve the user experience, streamline workflows, and boost performance, helping users get more value from the platform.

Nick Harvey
November 1, 2023
1 2 3 4 10