Snorkel AI has made building production-ready, high-value enterprise AI applications faster and easier than ever. The 2024.R3 update to our Snorkel Flow AI data development platform streamlines data-centric workflows, from generative AI evaluation to multi-schema annotation.
How one large financial institution used call center AI to inform customer experience management with real-time data.
Retrieval-augmented generation (RAG) enables LLMs to produce more accurate responses by finding and injecting relevant context. Learn how.
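For readers who want the gist before clicking through, the core retrieve-then-inject loop can be sketched in a few lines of Python. This is a toy illustration only; the lexical retriever and prompt template below are hypothetical stand-ins, not Snorkel Flow or any specific RAG library's API.

```python
# Toy retrieve-then-inject sketch of RAG (illustrative only; a real system
# would use an embedding-based retriever and an actual LLM call).

def retrieve_context(question: str, corpus: list[str], k: int = 2) -> list[str]:
    # Hypothetical lexical retriever: rank passages by word overlap with the question.
    q_words = set(question.lower().split())
    ranked = sorted(corpus, key=lambda p: len(q_words & set(p.lower().split())), reverse=True)
    return ranked[:k]

def build_prompt(question: str, passages: list[str]) -> str:
    # Inject the retrieved passages ahead of the question so the LLM answers from them.
    context = "\n".join(f"- {p}" for p in passages)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

corpus = [
    "Policy B excludes flood events from coverage.",
    "Claims must be filed within 30 days.",
    "Policy A covers accidental water damage.",
]
question = "Does Policy B cover flood events?"
print(build_prompt(question, retrieve_context(question, corpus)))
```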
We aim to help our customers get GenAI into production. In our 2024.R3 release, we’ve delivered some exciting GenAI evaluation results.
Discover new NLP features in Snorkel Flow’s 2024.R3 release, including named entity recognition for PDFs + advanced sequence tagging tools.
Discover the latest enterprise readiness features for Snorkel Flow. Configure safeguards for data compliance and security.
To tackle generative AI use cases, Snorkel AI + AWS launched an accelerator program to address the biggest blocker: unstructured data.
AI alignment ensures that AI systems align with human values, ethics, and policies. Here’s a primer on how developers can build safer AI.
Snorkel takes a step on the path to enterprise superalignment with new data development workflows for enterprise alignment.
A customer wanted an LLM system for complex contract question-answering tasks. We helped them build it, beating the baseline by 64 points.
Snorkel AI helped a client solve the challenge of social media content filtering quickly and sustainably. Here’s how.
In its first six months, Snorkel Foundry collaborated on high-value projects with notable companies and produced impressive results.
Learn how Snorkel, Databricks, and AWS enabled the team to build and deploy small, specialized, and highly accurate models that met their AI production requirements and strategic goals.
“Task Me Anything” empowers data scientists to generate bespoke benchmarks to assess and choose the right multimodal model for their needs.
Introducing Alfred: an open-source tool for combining foundation models with weak supervision for faster development of academic data sets.
This release features new GenAI tools and Multi-Schema Annotation, as well as new enterprise security tools and an updated home page.
Snorkel researchers devised a new way to evaluate long-context models and address their “lost-in-the-middle” challenges with medoid voting.
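As a rough intuition for what medoid-style voting means (not the researchers’ actual implementation; the embeddings and distance metric here are assumptions for illustration): generate several candidate answers, embed them, and pick the one closest to all the others.

```python
import numpy as np

def medoid_vote(candidate_embeddings: np.ndarray) -> int:
    """Return the index of the medoid: the candidate whose embedding has the
    smallest total distance to every other candidate."""
    diffs = candidate_embeddings[:, None, :] - candidate_embeddings[None, :, :]
    pairwise = np.linalg.norm(diffs, axis=-1)   # (n, n) distance matrix
    return int(np.argmin(pairwise.sum(axis=1)))

# Toy example: five candidate answers embedded in 3-D; the fourth is an outlier.
answers = np.array([
    [0.90, 0.10, 0.00],
    [1.00, 0.00, 0.10],
    [0.95, 0.05, 0.05],
    [0.00, 1.00, 0.90],   # outlier
    [0.92, 0.08, 0.02],
])
print(medoid_vote(answers))  # selects one of the clustered answers, not the outlier
```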
ROBOSHOT acts like a lens on foundation models and improves their zero-shot performance without additional fine-tuning.
Microsoft infrastructure facilitates Snorkel AI research experiments, including our recent high rank on the AlpacaEval 2.0 LLM leaderboard.
Humans learn tasks better when taught in a logical order. So do LLMs. Researchers developed “Skill-it!”, a method that exploits this tendency.
The founding team of Snorkel AI has spent over half a decade, first at the Stanford AI Lab and now at Snorkel AI, researching data-centric techniques to overcome the biggest bottleneck in AI: the lack of labeled training data. In this video, Snorkel AI co-founder Paroma Varma gives an overview of the key principles of data-centric AI development. What is data-centric AI?…
Large language models have enormous potential. But what are they? Where did they come from? And how can you make them work better?
Fine-tuning specialized LLMs demands a lot of time and money. We developed Bonito to make this process faster, cheaper, and easier.
Enterprises must evaluate LLM performance for production deployment. Custom, automated eval + data slices present the best path to production.
Meta’s Llama 3.1 405B rivals GPT-4o in benchmarks, offering powerful AI capabilities. Despite high costs, it can enhance LLM adoption through fine-tuning, distillation, and use as an AI judge.
Meta released Llama 3 405B today, signaling a new era of open source AI. The model is ready to use on Snorkel Flow.
High-performing AI systems require more than a well-designed model. They also require properly constructed training and testing data.
We need more labeled data than ever, so we have explored weak supervision for non-categorical applications—with notable results.
The Snorkel Flow label model plays an instrumental role in driving the enterprise value we create. Here’s a peek at how it works.
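To give a flavor of what aggregating labeling-function votes means, the toy sketch below uses a plain majority vote with abstentions. The actual Snorkel Flow label model is more sophisticated, estimating source accuracies and correlations rather than counting votes, so treat this purely as an intuition builder.

```python
from collections import Counter

ABSTAIN = -1

def majority_vote(lf_votes: list[list[int]]) -> list[int]:
    """Toy stand-in for a label model: per-example majority vote over labeling
    functions, ignoring abstentions. (A production label model instead learns
    each labeling function's accuracy and correlations.)"""
    labels = []
    for votes in lf_votes:
        counts = Counter(v for v in votes if v != ABSTAIN)
        labels.append(counts.most_common(1)[0][0] if counts else ABSTAIN)
    return labels

# Rows are examples, columns are labeling-function votes (class 0/1, -1 = abstain).
votes = [
    [1, 1, ABSTAIN],              # two LFs agree on class 1
    [0, ABSTAIN, 0],              # two LFs agree on class 0
    [ABSTAIN, ABSTAIN, ABSTAIN],  # no LF fires -> stays unlabeled
]
print(majority_vote(votes))  # [1, 0, -1]
```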
Vision language models demonstrate impressive image classification capabilities, but LLMs can help improve their performance. Learn how.
See a walkthrough of how Snorkel Flow users build applications with production-grade RAG retrieval components.
Snorkel Flow’s 2024.R1 release includes new role-based access control tools to further safeguard valuable enterprise data.
The manufacturing industry has experienced a massive influx of data. Snorkel AI and AWS SageMaker can make that data actionable.
Unlock advanced LLM customization with Snorkel Flow’s new release! Explore flexible data integrations, secure controls, and multimodal support to fine-tune language models for enterprise use. Discover how to leverage images and diverse data types for AI-driven insights.
Snorkel Flow’s new FM-first workflow for building document intelligence applications will get you from demo to production faster than ever.
Snorkel’s Paroma Varma and Google’s Ali Arsenjani discuss the role of data in the development and implementation of LLMs.
We’re excited to announce Snorkel Custom to help enterprises cross the chasm from flashy chatbot demos to real production AI value.