Latest posts
- Enterprise LLM challenges and how to overcome them - Large language models open many new opportunities for data science teams, but enterprise LLM challenges persist—and customization is key. ...
- LLM distillation techniques to explode in importance in 2024 - LLM distillation will become more important in 2024, according to a poll of attendees at Snorkel AI’s 2023 Enterprise LLM virtual summit. ...
- How to fine-tune large language models for enterprise use cases - LLMs have broad but shallow knowledge and fall short on specialized tasks. For best performance, enterprises must fine-tune their LLMs. ...
- Snorkel Flow 2023.R3 release: PaLM integration, streamlined onboarding, and enhanced user experience - The 2023.R3 Snorkel Flow release is packed with improvements that amplify user experience, streamline workflows, and enhance performance, ensuring our users derive unparalleled value from our platform. ...
- Navigating Biden’s AI executive order with AI data development - The Biden administration issued an executive order that creates new AI standards and challenges. AI data development can help. ...
- Snorkel AI researchers present 18 papers at NeurIPS 2023 - The Snorkel AI team will present 18 research papers and talks at the 2023 Neural Information Processing Systems (NeurIPS) conference from December 10-16. The Snorkel papers cover a broad range of topics including fairness, semi-supervised learning, large language models (LLMs), and domain-specific models. Snorkel AI is proud of its roots in… ...
- Two approaches to distill LLMs for better enterprise value - Distillation techniques allow enterprises to access the full predictive power of large language models at a tiny fraction of their cost. ...
- Enterprise LLM Summit highlights the importance of data development - Snorkel AI's Enterprise LLM Virtual Summit drew 1,000 attendees with speakers from Contextual AI, Google, Meta, Stanford, and Together AI. ...
Results: 33-40 of 249