
LoRA: Low-Rank Adaptation for LLMs

Low-rank adaptation (LoRA) lets data scientists customize GenAI models like LLMs faster than traditional full fine-tuning methods.

February 21, 2024

New benchmark results demonstrate value of Snorkel AI approach to LLM alignment

Snorkel researchers’ state-of-the-art methods created a 7B LLM that ranked 2nd, behind only GPT-4 Turbo, on the AlpacaEval 2.0 leaderboard.

January 24, 2024

Retrieval augmented generation (RAG): a conversation with its creator

Snorkel CEO Alex Ratner spoke with Douwe Kiela, an author of the original paper on retrieval augmented generation (RAG).

January 16, 2024

Stanford professor discusses exciting advances in foundation model evaluation

Snorkel CEO Alex Ratner chatted with Stanford Professor Percy Liang about evaluation in machine learning and in AI generally.

January 2, 2024

Snorkel AI researchers present 18 papers at NeurIPS 2023

The Snorkel AI team will present 18 research papers and talks at the 2023 Neural Information Processing Systems (NeurIPS) conference from December 10-16. The Snorkel papers cover a broad range of topics including fairness, semi-supervised learning, large language models (LLMs), and domain-specific models. Snorkel AI is proud of its roots in the research community and endeavors to remain at the forefront…

October 31, 2023

Two approaches to distill LLMs for better enterprise value

Distillation techniques allow enterprises to access the full predictive power of large language models at a tiny fraction of their cost.

October 31, 2023

Bloomberg’s Gideon Mann on the power of domain specialist LLMs

Gideon Mann, head of ML Product and Research at Bloomberg LP, chatted with Snorkel CEO Alex Ratner about building BloombergGPT.

October 17, 2023

Which is better, retrieval augmented generation (RAG) or fine-tuning? Both.

Professionals in the data science space often debate whether RAG or fine-tuning yields the better result. The answer is “both.”

September 20, 2023

Former U.S. Chief Data Scientist on past and future of data science

Former U.S. Chief Data Scientist DJ Patil talked with Snorkel AI CEO Alex Ratner on topics including the origin of the title “data scientist.”

September 12, 2023

4 new papers show foundation models can build on themselves

The surest way to improve foundation models is through more and better data, but Snorkel researchers showed that FMs can also learn from themselves.

August 31, 2023

Accelerating predictive task time to value with generative AI

Generative AI can write poems, recite common knowledge, and extract information. GenAI can also help quickly build predictive pipelines.

August 17, 2023

Data fuels enterprise AI value: 6 takeaways from the Gartner Hype Cycle for Artificial Intelligence, 2023

GenAI may be the most transformative technology of the past decade, but data is where enterprises can realize real value from AI today.

August 2, 2023

How we built better GenAI with programmatic data development

We used weak supervision to programmatically curate instruction-tuning data for open-source LLMs, building better GenAI as a result.

July 19, 2023

The future of large language models is faster and more robust

Snorkel and affiliated academic labs have been hard at work reducing the computational cost of large language models.

June 29, 2023