RAG: Boost LLM performance with retrieval-augmented generation

Retrieval-augmented generation (RAG) enables LLMs to produce more accurate responses by finding and injecting relevant context. Learn how.

Matt Casey
August 15, 2024
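
As a rough illustration of the idea in the deck above, here is a minimal Python sketch of the RAG loop: retrieve the passages most relevant to a question, inject them into the prompt as context, and only then ask the model. Everything here is a hypothetical stand-in, not Snorkel's implementation: the toy in-memory corpus, the keyword-overlap `retrieve` function (used in place of a real embedding index), and the `build_prompt` template are illustrative only.

```python
# Minimal RAG sketch (hypothetical corpus and helpers):
# 1) retrieve the passages most relevant to the question,
# 2) inject them into the prompt as context,
# 3) send that prompt to the LLM of your choice.

from collections import Counter

# Toy knowledge base; in practice this would be a vector index
# over embedded document chunks rather than an in-memory list.
CORPUS = [
    "Retrieval-augmented generation injects retrieved passages into the prompt.",
    "Snorkel Flow supports programmatic labeling with labeling functions.",
    "10-K filings list senior leadership and top shareholders.",
]

def retrieve(question: str, corpus: list[str], k: int = 2) -> list[str]:
    """Rank passages by word overlap with the question (a stand-in for embedding search)."""
    q_words = Counter(question.lower().split())

    def score(passage: str) -> int:
        return sum(q_words[w] for w in passage.lower().split() if w in q_words)

    return sorted(corpus, key=score, reverse=True)[:k]

def build_prompt(question: str, passages: list[str]) -> str:
    """Inject the retrieved context ahead of the question."""
    context = "\n".join(f"- {p}" for p in passages)
    return (
        "Answer the question using only the context below.\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )

if __name__ == "__main__":
    question = "How does retrieval-augmented generation improve LLM answers?"
    prompt = build_prompt(question, retrieve(question, CORPUS))
    print(prompt)  # This augmented prompt is what gets sent to the LLM.
```

In a production system the retrieval step typically queries a vector database over embedded document chunks, and the augmented prompt is passed to an LLM API; the grounding effect is the same either way: the model answers from supplied context rather than from memory alone.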
